[ 556.528434] env[66534]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 557.158634] env[66583]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 558.708766] env[66583]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=66583) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.709114] env[66583]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=66583) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.709157] env[66583]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=66583) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 558.709475] env[66583]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 558.710959] env[66583]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 558.826374] env[66583]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=66583) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 558.836767] env[66583]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=66583) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 558.937382] env[66583]: INFO nova.virt.driver [None req-c3d4fc98-9a7c-402a-8f89-6efce608784b None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 559.011502] env[66583]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 559.011669] env[66583]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 559.011756] env[66583]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=66583) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 562.269865] env[66583]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-b01ac87c-213e-418b-be8d-91e49583e435 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.286103] env[66583]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=66583) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 562.286326] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-490d5ece-ed8d-4fca-a50f-db64885cb838 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.321106] env[66583]: INFO oslo_vmware.api [-] Successfully established new session; session ID is b5715.
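The two oslo_concurrency.processutils entries above come from the volume connector code probing whether the installed iscsiadm binary supports the node.session.scan option. A minimal sketch of the same probe through the public processutils API, assuming the exit-code check is all that matters; the helper name is illustrative, not the exact call site:

# Sketch: probe iscsiadm for node.session.scan support, matching the
# "Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm"
# lines above. Helper name is hypothetical.
from oslo_concurrency import processutils

def iscsiadm_supports_manual_scan(path='/sbin/iscsiadm'):
    try:
        # execute() emits the "Running cmd" / 'CMD ... returned' DEBUG
        # lines seen above and raises on a non-zero exit code by default.
        processutils.execute('grep', '-F', 'node.session.scan', path)
        return True   # grep exited 0: the string is present
    except processutils.ProcessExecutionError:
        return False  # grep exited non-zero: option not supported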
[ 562.321265] env[66583]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.310s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.321906] env[66583]: INFO nova.virt.vmwareapi.driver [None req-c3d4fc98-9a7c-402a-8f89-6efce608784b None None] VMware vCenter version: 7.0.3
[ 562.325560] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e14e185f-fa94-4a35-b65a-425a0aa0af1f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.347131] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b02881e1-c446-4c36-bd41-150873f9cc28 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.353192] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b29ef5f5-e6a9-41f7-880a-6393ce623929 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.359881] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0dbbb93-86f1-46ce-bdc1-dad9fda49a06 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.373775] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9baa272a-df9b-48b8-b482-2800f1f3a074 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.379755] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a196898-0046-4e24-859b-8c01a6d476cc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.409369] env[66583]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-32d8fa15-9499-4e1d-8d45-53bfe301bc87 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 562.414726] env[66583]: DEBUG nova.virt.vmwareapi.driver [None req-c3d4fc98-9a7c-402a-8f89-6efce608784b None None] Extension org.openstack.compute already exists. {{(pid=66583) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 562.417363] env[66583]: INFO nova.compute.provider_config [None req-c3d4fc98-9a7c-402a-8f89-6efce608784b None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
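The session setup above (suds client creation, SessionManager.Login, the PropertyCollector calls) is all driven through oslo.vmware. A minimal sketch of establishing the same kind of session; the credentials and the retry/poll values here are placeholders and assumptions, only the host name matches this log:

# Sketch: create a vCenter session the way VMwareVCDriver does via
# oslo.vmware. Username/password below are placeholders.
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc1.osci.c.eu-de-1.cloud.sap',   # [vmware]host_ip from nova.conf
    'administrator@vsphere.local',    # placeholder username
    'secret',                         # placeholder password
    api_retry_count=10,               # assumed retry/poll settings
    task_poll_interval=0.5)
# Constructing the session performs the SessionManager.Login logged
# above; later vSphere calls such as PropertyCollector reads go
# through session.invoke_api(...).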
[ 562.435034] env[66583]: DEBUG nova.context [None req-c3d4fc98-9a7c-402a-8f89-6efce608784b None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),e114da18-3889-44ed-bcc6-d0521cd01fe4(cell1) {{(pid=66583) load_cells /opt/stack/nova/nova/context.py:464}}
[ 562.436892] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 562.437124] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 562.437841] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 562.438190] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Acquiring lock "e114da18-3889-44ed-bcc6-d0521cd01fe4" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 562.438380] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Lock "e114da18-3889-44ed-bcc6-d0521cd01fe4" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 562.439343] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Lock "e114da18-3889-44ed-bcc6-d0521cd01fe4" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
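The Acquiring/acquired/"released" triples above are oslo.concurrency's standard lock logging (the inner wrapper at lockutils.py:404/409/423). A minimal sketch of the pattern that produces them, using one of the cell UUIDs above as the lock name; the function body is illustrative:

# Sketch: a lockutils-guarded cache fill, which emits the
# "Acquiring lock ... by ..." / "acquired ... waited" /
# '"released" ... held' DEBUG lines above.
from oslo_concurrency import lockutils

@lockutils.synchronized('e114da18-3889-44ed-bcc6-d0521cd01fe4')
def get_or_set_cached_cell_and_set_connections():
    # Build and cache this cell's DB connection exactly once;
    # concurrent callers block on the lock instead of duplicating work.
    pass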
"/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 562.459133] env[66583]: return func(*args, **kwargs) [ 562.459133] env[66583]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 562.459133] env[66583]: result = fn(*args, **kwargs) [ 562.459133] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 562.459133] env[66583]: return f(*args, **kwargs) [ 562.459133] env[66583]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 562.459133] env[66583]: return db.service_get_minimum_version(context, binaries) [ 562.459133] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 562.459133] env[66583]: _check_db_access() [ 562.459133] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 562.459133] env[66583]: stacktrace = ''.join(traceback.format_stack()) [ 562.459133] env[66583]: [ 562.459875] env[66583]: ERROR nova.db.main.api [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 562.459875] env[66583]: result = function(*args, **kwargs) [ 562.459875] env[66583]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 562.459875] env[66583]: return func(*args, **kwargs) [ 562.459875] env[66583]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 562.459875] env[66583]: result = fn(*args, **kwargs) [ 562.459875] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 562.459875] env[66583]: return f(*args, **kwargs) [ 562.459875] env[66583]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 562.459875] env[66583]: return db.service_get_minimum_version(context, binaries) [ 562.459875] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 562.459875] env[66583]: _check_db_access() [ 562.459875] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 562.459875] env[66583]: stacktrace = ''.join(traceback.format_stack()) [ 562.459875] env[66583]: [ 562.460219] env[66583]: WARNING nova.objects.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Failed to get minimum service version for cell e114da18-3889-44ed-bcc6-d0521cd01fe4 [ 562.460372] env[66583]: WARNING nova.objects.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 562.460789] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Acquiring lock "singleton_lock" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 562.460947] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Acquired lock "singleton_lock" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 562.461199] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Releasing lock "singleton_lock" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 562.461541] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Full set of CONF: {{(pid=66583) _wait_for_exit_or_signal 
[ 562.461541] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Full set of CONF: {{(pid=66583) _wait_for_exit_or_signal /usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}}
[ 562.461689] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ******************************************************************************** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}}
[ 562.461816] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] Configuration options gathered from: {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}}
[ 562.461948] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}}
[ 562.462149] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}}
[ 562.462309] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ================================================================================ {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}}
[ 562.462510] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] allow_resize_to_same_host = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.462679] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] arq_binding_timeout = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.462810] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] backdoor_port = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.462934] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] backdoor_socket = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463117] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] block_device_allocate_retries = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463283] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] block_device_allocate_retries_interval = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463453] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cert = self.pem {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463642] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463838] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute_monitors = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.463997] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] config_dir = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464181] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] config_drive_format = iso9660 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464319] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464483] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] config_source = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464651] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] console_host = devstack {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464815] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] control_exchange = nova {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.464972] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cpu_allocation_ratio = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.465146] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] daemon = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.465316] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] debug = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.465477] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] default_access_ip_network_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.465676] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] default_availability_zone = nova {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.465913] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] default_ephemeral_format = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.466185] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.466387] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] default_schedule_zone = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.466578] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] disk_allocation_ratio = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.466749] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] enable_new_services = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.466929] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] enabled_apis = ['osapi_compute'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467108] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] enabled_ssl_apis = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467275] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] flat_injected = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467439] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] force_config_drive = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467600] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] force_raw_images = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467767] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] graceful_shutdown_timeout = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.467927] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] heal_instance_info_cache_interval = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.468158] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] host = cpu-1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.468332] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] initial_cpu_allocation_ratio = 4.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.468496] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] initial_disk_allocation_ratio = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.468656] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] initial_ram_allocation_ratio = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.468865] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469039] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_build_timeout = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469206] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_delete_interval = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469374] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_format = [instance: %(uuid)s] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469541] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_name_template = instance-%08x {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469702] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_usage_audit = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.469873] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_usage_audit_period = month {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470046] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470214] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] instances_path = /opt/stack/data/nova/instances {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470380] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] internal_service_availability_zone = internal {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470551] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] key = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470720] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] live_migration_retry_count = 30 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.470883] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_config_append = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471057] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471269] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_dir = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471401] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471531] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_options = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471693] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_rotate_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.471861] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_rotate_interval_type = days {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472036] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] log_rotation_type = none {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472169] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472315] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472491] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472655] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472783] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.472946] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] long_rpc_timeout = 1800 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473116] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_concurrent_builds = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473278] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_concurrent_live_migrations = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473440] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_concurrent_snapshots = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473616] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_local_block_devices = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473792] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_logfile_count = 30 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.473950] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] max_logfile_size_mb = 200 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474125] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] maximum_instance_delete_attempts = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474297] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metadata_listen = 0.0.0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474466] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metadata_listen_port = 8775 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474634] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metadata_workers = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474796] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] migrate_max_retries = -1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.474966] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] mkisofs_cmd = genisoimage {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475189] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] my_block_storage_ip = 10.180.1.21 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475324] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] my_ip = 10.180.1.21 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475488] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] network_allocate_retries = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475667] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475832] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] osapi_compute_listen = 0.0.0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.475991] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] osapi_compute_listen_port = 8774 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476171] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] osapi_compute_unique_server_name_scope = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476337] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] osapi_compute_workers = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476497] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] password_length = 12 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476656] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] periodic_enable = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476820] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] periodic_fuzzy_delay = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.476997] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] pointer_model = usbtablet {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477179] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] preallocate_images = none {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477341] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] publish_errors = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477475] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] pybasedir = /opt/stack/nova {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477637] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ram_allocation_ratio = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477818] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rate_limit_burst = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.477992] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rate_limit_except_level = CRITICAL {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.478168] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rate_limit_interval = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.478330] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reboot_timeout = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.478523] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reclaim_instance_interval = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.478687] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] record = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.478854] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reimage_timeout_per_gb = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479028] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] report_interval = 120 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479196] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rescue_timeout = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479354] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reserved_host_cpus = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479515] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reserved_host_disk_mb = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479672] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reserved_host_memory_mb = 512 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479830] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] reserved_huge_pages = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.479988] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] resize_confirm_window = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480162] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] resize_fs_using_block_device = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480324] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] resume_guests_state_on_host_boot = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480495] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480654] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rpc_response_timeout = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480817] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] run_external_periodic_tasks = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.480987] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] running_deleted_instance_action = reap {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481166] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] running_deleted_instance_poll_interval = 1800 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481350] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] running_deleted_instance_timeout = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481520] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler_instance_sync_interval = 120 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481655] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_down_time = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481826] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] servicegroup_driver = db {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.481983] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] shelved_offload_time = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.482165] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] shelved_poll_interval = 3600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.482369] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] shutdown_timeout = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.482537] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] source_is_ipv6 = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.482693] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ssl_only = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.482985] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483123] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] sync_power_state_interval = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483286] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] sync_power_state_pool_size = 1000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483455] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] syslog_log_facility = LOG_USER {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483614] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] tempdir = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483774] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] timeout_nbd = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.483942] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] transport_url = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484116] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] update_resources_interval = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484280] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_cow_images = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484439] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_eventlog = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484599] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_journal = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484754] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_json = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.484911] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_rootwrap_daemon = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485079] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_stderr = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485244] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] use_syslog = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485404] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vcpu_pin_set = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485571] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plugging_is_fatal = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485738] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plugging_timeout = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.485903] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] virt_mkfs = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.486074] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] volume_usage_poll_interval = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.486239] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] watch_log_file = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
[ 562.486409] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] web = /usr/share/spice-html5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}}
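The options above are the un-grouped [DEFAULT] set, printed at cfg.py:2602; everything below is per-group, printed as group.option = value at cfg.py:2609 (oslo_concurrency, oslo_messaging_metrics, api, cache, cinder, and so on). A minimal sketch of how a service produces this dump through oslo.config's public API; the two registered options are illustrative, not nova's schema:

# Sketch: emitting the "option = value" dump above via oslo.config.
import logging
from oslo_config import cfg

CONF = cfg.CONF
CONF.register_opts([cfg.BoolOpt('debug', default=False)])  # [DEFAULT] opt
CONF.register_opts([cfg.StrOpt('lock_path')], group='oslo_concurrency')

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger('oslo_service.service')

CONF([], project='nova')
# Prints DEFAULT options first, then grouped options as
# "oslo_concurrency.lock_path = ..."; options marked secret are
# masked as **** (e.g. transport_url and cache.backend_argument here).
CONF.log_opt_values(LOG, logging.DEBUG)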
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] watch_log_file = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.486409] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] web = /usr/share/spice-html5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 562.486594] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_concurrency.disable_process_locking = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.486894] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487086] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487258] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487432] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_metrics.metrics_process_name = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487600] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487766] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.487949] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.auth_strategy = keystone {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488130] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.compute_link_prefix = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488308] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488483] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.dhcp_domain = novalocal {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488885] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.enable_instance_password = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488885] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.glance_link_prefix = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.488981] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489144] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.instance_list_cells_batch_strategy = distributed {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489309] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.instance_list_per_project_cells = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489472] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.list_records_by_skipping_down_cells = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489635] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.local_metadata_per_cell = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489802] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.max_limit = 1000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.489968] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.metadata_cache_expiration = 15 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.490154] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.neutron_default_tenant_id = default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.490322] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.use_forwarded_for = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.490513] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.use_neutron_default_nets = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.490694] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.490862] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_dynamic_failure_fatal = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491041] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491226] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_dynamic_ssl_certfile = {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491420] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_dynamic_targets = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491594] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_jsonfile_path = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491777] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api.vendordata_providers = ['StaticJSON'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.491969] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.backend = dogpile.cache.memcached {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492149] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.backend_argument = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492326] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.config_prefix = cache.oslo {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492494] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.dead_timeout = 60.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492661] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.debug_cache_backend = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492824] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.enable_retry_client = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.492988] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.enable_socket_keepalive = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.493175] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.enabled = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.493346] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.expiration_time = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.493540] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.hashclient_retry_attempts = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.493721] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.hashclient_retry_delay = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.493887] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] 
cache.memcache_dead_retry = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494067] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_password = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494237] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494401] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494562] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_pool_maxsize = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494721] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_pool_unused_timeout = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.494881] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_sasl_enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495068] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_servers = ['localhost:11211'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495238] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_socket_timeout = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495409] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.memcache_username = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495574] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.proxies = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495736] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.retry_attempts = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.495897] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.retry_delay = 0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.496068] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.socket_keepalive_count = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.496235] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.socket_keepalive_idle = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
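The cache.* records in this stretch of the dump describe Nova's oslo.cache setup: a dogpile.cache.memcached backend pointed at localhost:11211, with the 600-second default TTL logged earlier in the dump (cache.expiration_time = 600). As a rough illustration only, since Nova wires this through oslo.cache rather than calling dogpile directly, a region built with the same backend, TTL, and server list might look like the following sketch; the cached key and value are placeholders:

```python
# Illustrative sketch, not Nova code: a dogpile.cache region using the
# backend, expiration time, and server list from the option dump.
from dogpile.cache import make_region

region = make_region().configure(
    'dogpile.cache.memcached',               # cache.backend
    expiration_time=600,                     # cache.expiration_time
    arguments={'url': ['localhost:11211']},  # cache.memcache_servers
)

region.set('demo-key', 'demo-value')  # needs a reachable memcached
assert region.get('demo-key') == 'demo-value'
```

The cache.memcache_pool_* and cache.socket_keepalive_* values in the surrounding records tune the client connection pool and TCP keepalive behaviour rather than the region itself.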
[ 562.496409] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.socket_keepalive_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.496589] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.tls_allowed_ciphers = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.496749] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.tls_cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.496906] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.tls_certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497080] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.tls_enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497243] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cache.tls_keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497415] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497588] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.auth_type = password {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497748] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.497925] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.catalog_info = volumev3::publicURL {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.498101] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.498273] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.498439] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.cross_az_attach = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.498603] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.debug = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.498763] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.endpoint_template = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
[ 562.498926] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.http_retries = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499103] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499285] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499475] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.os_region_name = RegionOne {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499643] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499807] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cinder.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.499978] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500161] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.cpu_dedicated_set = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500322] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.cpu_shared_set = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500487] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.image_type_exclude_list = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500650] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.live_migration_wait_for_vif_plug = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500814] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.max_concurrent_disk_ops = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.500976] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.max_disk_devices_to_attach = -1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.501154] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.501353] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
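Every record in this dump is produced by the same mechanism: oslo.config's log_opt_values(), the cfg.py:2609 frame named at the end of each line, which walks every registered option group at service startup and logs one group.option = value pair per option, printing **** for options registered as secret (such as the database connection URLs further down). A minimal standalone reproduction, with a made-up project name and one option borrowed from the [compute] group:

```python
# Minimal sketch of the mechanism behind these records: register an
# option, parse an (empty) command line, then dump all option values
# at DEBUG level. The project name 'demo' is a placeholder.
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.ConfigOpts()
CONF.register_opts(
    [cfg.IntOpt('shutdown_retry_interval', default=10)],
    group='compute',
)
CONF(args=[], project='demo')
CONF.log_opt_values(LOG, logging.DEBUG)
# ... logs "compute.shutdown_retry_interval = 10", matching the
# format of the records in this dump.
```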
[ 562.501527] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.resource_provider_association_refresh = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.501691] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.shutdown_retry_interval = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.501873] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.502060] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] conductor.workers = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.502267] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] console.allowed_origins = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.502472] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] console.ssl_ciphers = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.502659] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] console.ssl_minimum_version = default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.502837] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] consoleauth.token_ttl = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503014] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503190] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503345] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503506] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503664] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503822] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.503984] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.insecure = False {{(pid=66583) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504157] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504320] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504479] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504635] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504793] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.504963] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.service_type = accelerator {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505151] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505315] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505477] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505634] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505815] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.505977] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] cyborg.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.506176] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.backend = sqlalchemy {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.506367] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.connection = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.506539] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.connection_debug = 0 {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.506712] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.connection_parameters = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.506878] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.connection_recycle_time = 3600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507056] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.connection_trace = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507228] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.db_inc_retry_interval = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507394] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.db_max_retries = 20 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507557] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.db_max_retry_interval = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507719] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.db_retry_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.507889] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.max_overflow = 50 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.508061] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.max_pool_size = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.508233] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.max_retries = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.508398] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.mysql_enable_ndb = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.508577] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.mysql_sql_mode = TRADITIONAL {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.508837] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.mysql_wsrep_sync_wait = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.509053] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.pool_timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.509240] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.retry_interval = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
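The database.* values above are oslo.db's SQLAlchemy settings (database.backend = sqlalchemy, with the real connection URL masked as ****). For orientation only, since Nova reaches SQLAlchemy through oslo.db rather than directly, the pool-related options line up with plain SQLAlchemy engine arguments roughly as in the sketch below; the sqlite:// URL is a stand-in for the masked database.connection:

```python
# Rough mapping sketch, not Nova code: the logged pool settings as
# plain SQLAlchemy engine parameters. The sqlite:// URL is a stand-in
# for the masked database.connection value.
import sqlalchemy
from sqlalchemy.pool import QueuePool

engine = sqlalchemy.create_engine(
    'sqlite:///demo.db',
    poolclass=QueuePool,
    pool_size=5,        # database.max_pool_size
    max_overflow=50,    # database.max_overflow
    pool_recycle=3600,  # database.connection_recycle_time
)

with engine.connect() as conn:
    print(conn.execute(sqlalchemy.text('SELECT 1')).scalar())
```

The db_max_retries, db_retry_interval, and related values are oslo.db's own reconnect knobs and have no direct SQLAlchemy equivalent.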
[ 562.509405] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.slave_connection = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.509574] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.sqlite_synchronous = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.509736] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] database.use_db_reconnect = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.509921] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.backend = sqlalchemy {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510112] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.connection = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510287] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.connection_debug = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510461] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.connection_parameters = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510626] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.connection_recycle_time = 3600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510794] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.connection_trace = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.510960] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.db_inc_retry_interval = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.511139] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.db_max_retries = 20 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.511333] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.db_max_retry_interval = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.511508] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.db_retry_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.511680] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.max_overflow = 50 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.511845] env[66583]: DEBUG oslo_service.service [None
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.max_pool_size = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.512026] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.max_retries = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.512189] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.mysql_enable_ndb = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.513799] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.513995] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.mysql_wsrep_sync_wait = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.514192] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.pool_timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.514378] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.retry_interval = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.514552] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.slave_connection = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.514730] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] api_database.sqlite_synchronous = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.514913] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] devices.enabled_mdev_types = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.515111] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.515287] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ephemeral_storage_encryption.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.515460] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ephemeral_storage_encryption.key_size = 512 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.515668] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.api_servers = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.515847] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.cafile = None {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516028] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516206] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516369] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516534] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516702] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.debug = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.516874] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.default_trusted_certificate_ids = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517052] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.enable_certificate_validation = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517223] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.enable_rbd_download = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517389] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517558] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517724] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.517886] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518058] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518229] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.num_retries = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518418] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.rbd_ceph_conf = {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518587] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.rbd_connect_timeout = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518760] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.rbd_pool = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.518931] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.rbd_user = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519106] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519278] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519448] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.service_type = image {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519614] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519776] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.519936] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520108] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520293] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520462] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.verify_glance_signatures = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520624] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] glance.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520793] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] guestfs.debug = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.520965] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.config_drive_cdrom = False {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521146] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.config_drive_inject_password = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521317] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521484] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.enable_instance_metrics_collection = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521648] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.enable_remotefx = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521819] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.instances_path_share = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.521986] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.iscsi_initiator_list = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.522165] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.limit_cpu_features = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.522378] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.522575] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.522752] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.power_state_check_timeframe = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.522918] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.power_state_event_polling_interval = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523104] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523275] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.use_multipath_io = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523445] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.volume_attach_retry_count = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523608] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.volume_attach_retry_interval = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523769] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.vswitch_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.523931] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.524115] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] mks.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.524473] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.524670] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.manager_interval = 2400 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.524843] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.precache_concurrency = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525025] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.remove_unused_base_images = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525203] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525373] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525555] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] image_cache.subdirectory_name = _base {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525732] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.api_max_retries = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.525899] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.api_retry_interval = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526071] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526238] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.auth_type = None {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526427] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526615] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526787] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.526950] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527127] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527291] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527459] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527620] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527781] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.527939] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528113] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.partition_key = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528282] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.peer_list = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528445] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528610] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.serial_console_state_timeout = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528770] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.service_name = None {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.528943] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.service_type = baremetal {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529123] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529289] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529450] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529612] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529799] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.529963] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ironic.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.530161] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.530435] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] key_manager.fixed_key = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.530622] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.530788] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.barbican_api_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.530951] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.barbican_endpoint = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.531146] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.barbican_endpoint_type = public {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.barbican_region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532581] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.number_of_retries = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532809] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.retry_delay = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532809] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.send_service_user_token = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.532864] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533020] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533186] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.verify_ssl = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533348] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican.verify_ssl_path = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533515] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533691] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.auth_type = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.533851] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534015] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534184] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534348] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534782] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534782] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534846] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] barbican_service_user.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.534974] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.approle_role_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535145] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.approle_secret_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535306] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535466] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535632] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535795] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.535953] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536140] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.kv_mountpoint = secret {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536308] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.kv_version = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536470] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.namespace = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536632] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.root_token_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536795] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.536954] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.ssl_ca_crt_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.537125] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.537291] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.use_ssl = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.537462] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.537686] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.537872] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538053] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538220] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538383] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538543] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538705] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.538865] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539033] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 
None None] keystone.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539198] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539359] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539518] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539690] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.service_type = identity {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.539854] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540016] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540183] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540342] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540526] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540688] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] keystone.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.540892] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.connection_uri = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541066] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_mode = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541256] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_model_extra_flags = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541440] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_models = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541619] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None 
None] libvirt.cpu_power_governor_high = performance {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541791] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_power_governor_low = powersave {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.541956] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_power_management = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.542146] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.542357] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.device_detach_attempts = 8 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.542552] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.device_detach_timeout = 20 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.542726] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.disk_cachemodes = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.542889] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.disk_prefix = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543069] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.enabled_perf_events = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543245] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.file_backed_memory = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543418] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.gid_maps = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543577] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.hw_disk_discard = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543739] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.hw_machine_type = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.543930] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_rbd_ceph_conf = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544118] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544289] env[66583]: DEBUG 
oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544463] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_rbd_glance_store_name = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544634] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_rbd_pool = rbd {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544805] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_type = default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.544966] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.images_volume_group = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545145] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.inject_key = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545310] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.inject_partition = -2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545483] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.inject_password = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545650] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.iscsi_iface = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545815] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.iser_use_multipath = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.545982] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_bandwidth = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.546161] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_completion_timeout = 800 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.546329] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_downtime = 500 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.546495] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_downtime_delay = 75 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.546657] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_downtime_steps = 10 {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.546876] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_inbound_addr = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547091] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_permit_auto_converge = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547268] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_permit_post_copy = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547441] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_scheme = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547619] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_timeout_action = abort {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547787] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_tunnelled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.547947] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_uri = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.548125] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.live_migration_with_native_tls = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.548291] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.max_queues = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.548464] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.mem_stats_period_seconds = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.548631] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.nfs_mount_options = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.548942] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549129] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_aoe_discover_tries = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549302] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_iser_scan_tries = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549467] env[66583]: DEBUG 
oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_memory_encrypted_guests = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549633] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_nvme_discover_tries = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549797] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_pcie_ports = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.549996] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.num_volume_scan_tries = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.550200] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.pmem_namespaces = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.550366] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.quobyte_client_cfg = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.550661] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.550835] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rbd_connect_timeout = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.551113] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.551318] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.551517] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rbd_secret_uuid = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.551696] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rbd_user = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.551865] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.realtime_scheduler_priority = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552075] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.remote_filesystem_transport = ssh {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552251] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rescue_image_id = None {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552415] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rescue_kernel_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552574] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rescue_ramdisk_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552743] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rng_dev_path = /dev/urandom {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.552912] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.rx_queue_size = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.553360] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.smbfs_mount_options = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.553509] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.553722] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.snapshot_compression = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.553923] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.snapshot_image_format = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.554183] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.554394] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.sparse_logical_volumes = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.554619] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.swtpm_enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.554806] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.swtpm_group = tss {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.554991] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.swtpm_user = tss {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.555188] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.sysinfo_serial = unique {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.555353] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.tx_queue_size = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.555521] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.uid_maps = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.555686] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.use_virtio_for_bridges = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.555861] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.virt_type = kvm {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556041] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.volume_clear = zero {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556212] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.volume_clear_size = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556391] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.volume_use_multipath = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556592] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_cache_path = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556772] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.556943] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_mount_group = qemu {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.557125] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_mount_opts = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.557300] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.557579] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.557759] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.vzstorage_mount_user = stack {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.557928] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558123] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558304] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.auth_type = password {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558470] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558633] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558797] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.558959] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.559132] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.559306] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.default_floating_pool = public {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.559469] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.559680] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.extension_sync_interval = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.559893] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.http_retries = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560097] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560273] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560441] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560615] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.metadata_proxy_shared_secret = **** {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560776] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.560949] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.ovs_bridge = br-int {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.561132] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.physnets = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.561344] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.region_name = RegionOne {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.561527] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.service_metadata_proxy = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.561694] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.561867] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.service_type = network {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562042] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562230] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562390] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562566] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562754] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.562963] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] neutron.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.563164] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] notifications.bdms_in_notifications = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.563349] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] notifications.default_level = INFO 
{{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.563571] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] notifications.notification_format = unversioned {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.563771] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] notifications.notify_on_state_change = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.563872] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564059] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] pci.alias = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564236] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] pci.device_spec = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564405] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] pci.report_in_placement = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564585] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564761] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.auth_type = password {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.564932] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.auth_url = http://10.180.1.21/identity {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565107] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565271] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565441] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565602] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565762] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.565922] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.default_domain_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566123] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.default_domain_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566312] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.domain_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566484] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.domain_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566641] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566899] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.566983] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567137] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567295] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567465] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.password = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567625] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.project_domain_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567800] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.project_domain_name = Default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.567969] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.project_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568160] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.project_name = service {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568334] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.region_name = RegionOne {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568499] env[66583]: DEBUG oslo_service.service 
[None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568668] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.service_type = placement {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568834] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.568994] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.569168] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.569329] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.system_scope = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.569528] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.569700] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.trust_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.569870] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.user_domain_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570047] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.user_domain_name = Default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570219] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.user_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570395] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.username = placement {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570580] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570745] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] placement.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.570923] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.cores = 20 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.571104] env[66583]: DEBUG 
oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.count_usage_from_placement = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.571313] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.571496] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.injected_file_content_bytes = 10240 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.571666] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.injected_file_path_length = 255 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.571833] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.injected_files = 5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572009] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.instances = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572182] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.key_pairs = 100 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572363] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.metadata_items = 128 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572568] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.ram = 51200 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572762] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.recheck_quota = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.572935] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.server_group_members = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.573114] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] quota.server_groups = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.573288] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rdp.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.573614] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.573801] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.573974] 
env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.574154] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.image_metadata_prefilter = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.574354] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.574532] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.max_attempts = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.574698] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.max_placement_results = 1000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.574865] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575040] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.query_placement_for_availability_zone = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575210] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.query_placement_for_image_type_support = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575376] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575571] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] scheduler.workers = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575750] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.575955] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.576159] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.576334] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.576511] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.576677] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.576842] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577045] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577221] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.host_subset_size = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577400] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.image_properties_default_architecture = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577598] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577771] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.isolated_hosts = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.577940] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.isolated_images = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578121] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.max_instances_per_host = 50 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578291] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578457] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.pci_in_placement = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578623] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578784] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.578953] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.579165] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.579339] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.579511] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.579674] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.track_instance_changes = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.579851] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.580033] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metrics.required = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.580205] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metrics.weight_multiplier = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.580372] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metrics.weight_of_unavailable = -10000.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.580540] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] metrics.weight_setting = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.580871] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581062] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581266] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.port_range = 10000:20000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581450] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581625] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581828] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] serial_console.serialproxy_port = 6083 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.581961] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582145] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.auth_type = password {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582321] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582507] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582673] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582835] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.582990] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.583175] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.send_service_user_token = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.583340] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] service_user.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.583501] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None 
None] service_user.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.583670] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.agent_enabled = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.583846] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.584182] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.584366] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.html5proxy_host = 0.0.0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.584542] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.html5proxy_port = 6082 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.584705] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.image_compression = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.584863] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.jpeg_compression = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585034] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.playback_compression = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585211] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.server_listen = 127.0.0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585385] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585547] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.streaming_mode = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585704] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] spice.zlib_compression = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.585868] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] upgrade_levels.baseapi = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586034] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] upgrade_levels.cert = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586208] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] upgrade_levels.compute = auto {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586367] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] upgrade_levels.conductor = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586533] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] upgrade_levels.scheduler = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586702] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.586864] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.auth_type = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587029] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587189] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587350] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587512] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587669] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587828] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.587982] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vendordata_dynamic_auth.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588173] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.api_retry_count = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588335] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.ca_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588506] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.cache_prefix = devstack-image-cache {{(pid=66583) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588672] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.cluster_name = testcl1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588836] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.connection_pool_size = 10 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.588992] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.console_delay_seconds = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.589174] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.datastore_regex = ^datastore.* {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.589386] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.589559] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.host_password = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.589724] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.host_port = 443 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.589890] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.host_username = administrator@vsphere.local {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590066] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.insecure = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590229] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.integration_bridge = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590393] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.maximum_objects = 100 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590551] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.pbm_default_policy = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590713] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.pbm_enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.590869] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.pbm_wsdl_location = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591043] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591207] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.serial_port_proxy_uri = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591397] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.serial_port_service_uri = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591572] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.task_poll_interval = 0.5 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591746] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.use_linked_clone = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.591917] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.vnc_keymap = en-us {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.592097] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.vnc_port = 5900 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.592281] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vmware.vnc_port_total = 10000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.592503] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.auth_schemes = ['none'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.592689] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.592986] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.593188] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.593363] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.novncproxy_port = 6080 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.593542] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.server_listen = 127.0.0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.593713] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.593873] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 
None None] vnc.vencrypt_ca_certs = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594065] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.vencrypt_client_cert = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594200] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vnc.vencrypt_client_key = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594379] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594542] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_deep_image_inspection = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594703] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_fallback_pcpu_query = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.594861] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_group_policy_check_upcall = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595031] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595197] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.disable_rootwrap = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595357] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.enable_numa_live_migration = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595519] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595677] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595834] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.handle_virt_lifecycle_events = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.595994] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.libvirt_disable_apic = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.596167] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.596351] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.596554] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.596728] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.596895] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597073] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597243] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597410] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597571] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597736] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.597919] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598099] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.client_socket_timeout = 900 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598271] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.default_pool_size = 1000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598441] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.keep_alive = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598609] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] 
wsgi.max_header_line = 16384 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598770] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.secure_proxy_ssl_header = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.598932] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.ssl_ca_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599104] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.ssl_cert_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599269] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.ssl_key_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599689] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.tcp_keepidle = 600 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599689] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599778] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] zvm.ca_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.599925] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] zvm.cloud_connector_url = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.600228] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.600430] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] zvm.reachable_timeout = 300 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.600622] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.enforce_new_defaults = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.600796] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.enforce_scope = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.600974] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.policy_default_rule = default {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.601176] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
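The wall of DEBUG records above and below is oslo.config's startup dump: when the service starts, it walks every registered option group and logs one record per value, which is what the `log_opt_values .../oslo_config/cfg.py:2609` tag on every record points at. A minimal sketch of that mechanism, assuming only stock oslo.config behaviour; the [vmware] option names mirror values logged above, but the script itself is illustrative rather than actual Nova code:

```python
import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

CONF = cfg.CONF
CONF.register_opts(
    [
        cfg.StrOpt('host_ip'),
        cfg.PortOpt('host_port', default=443),
        cfg.StrOpt('host_password', secret=True),  # secret opts are logged as '****'
        cfg.IntOpt('api_retry_count', default=10),
    ],
    group='vmware',
)

# A service calls this once at startup; it emits one DEBUG record per
# registered option ("group.name = value") and closes with a banner row
# of asterisks.
CONF.log_opt_values(LOG, logging.DEBUG)
```

The `secret=True` flag is why `vmware.host_password` and `oslo_messaging_notifications.transport_url` appear as **** in this dump, and the row of asterisks that ends the dump further down is the closing banner from the same function (that final record cites cfg.py:2613 rather than 2609).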
[ 562.601382] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.policy_file = policy.yaml {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.601564] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.601731] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.601894] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602066] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602236] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602410] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602587] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602764] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.connection_string = messaging:// {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.602931] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.enabled = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603112] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.es_doc_type = notification {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603280] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.es_scroll_size = 10000 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603451] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.es_scroll_time = 2m {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603613] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.filter_error_trace = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603781] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.hmac_keys = SECRET_KEY {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.603949] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.sentinel_service_name = mymaster {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604160] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.socket_timeout = 0.1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604299] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] profiler.trace_sqlalchemy = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604468] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] remote_debug.host = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604628] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] remote_debug.port = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604806] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.604970] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605147] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605314] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605480] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605644] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605803] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.605965] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606138] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606298] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606476] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606639] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606806] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.606976] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607152] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607332] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607503] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607668] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607834] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.607999] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.608177] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.608345] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.608557] env[66583]: DEBUG 
oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.608733] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.608902] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609083] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609264] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609441] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609604] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609777] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.609947] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_rabbit.ssl_version = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.610150] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.610321] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_notifications.retry = -1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.610512] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.610685] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_messaging_notifications.transport_url = **** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.610857] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.auth_section = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611029] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.auth_type = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611193] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.cafile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611372] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.certfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611542] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.collect_timing = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611708] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.connect_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.611867] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.connect_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612035] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.endpoint_id = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612200] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.endpoint_override = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612389] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.insecure = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612556] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.keyfile = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612715] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.max_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.612876] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.min_version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613038] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.region_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613200] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.service_name = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613360] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.service_type = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613529] env[66583]: DEBUG oslo_service.service [None 
req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.split_loggers = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613687] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.status_code_retries = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.613848] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.status_code_retry_delay = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614016] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.timeout = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614180] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.valid_interfaces = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614337] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_limit.version = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614503] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_reports.file_event_handler = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614667] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_reports.file_event_handler_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614824] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] oslo_reports.log_dir = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.614995] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615171] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_linux_bridge_privileged.group = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615331] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615499] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615664] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615821] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.615992] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616166] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.group = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616328] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.helper_command = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616493] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616654] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616814] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] vif_plug_ovs_privileged.user = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.616980] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.flat_interface = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.617174] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.617347] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.617518] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.617688] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.617852] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618027] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618195] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618376] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.isolate_vif = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618545] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618712] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.618882] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619060] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.ovsdb_interface = native {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619230] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_vif_ovs.per_port_bridge = False {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619399] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_brick.lock_path = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619561] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619722] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] os_brick.wait_mpath_device_interval = 1 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.619890] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.capabilities = [21] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620066] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.group = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620229] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.helper_command = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620428] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620640] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.thread_pool_size = 8 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620808] env[66583]: DEBUG oslo_service.service 
[None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] privsep_osbrick.user = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.620983] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621158] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.group = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621335] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.helper_command = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621512] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621675] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.thread_pool_size = 8 {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621832] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] nova_sys_admin.user = None {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 562.621961] env[66583]: DEBUG oslo_service.service [None req-f3ec4232-1930-4740-a71b-bae3c8d9ae25 None None] ******************************************************************************** {{(pid=66583) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 562.622386] env[66583]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 562.630675] env[66583]: INFO nova.virt.node [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Generated node identity 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc [ 562.630908] env[66583]: INFO nova.virt.node [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Wrote node identity 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc to /opt/stack/data/n-cpu-1/compute_id [ 562.641835] env[66583]: WARNING nova.compute.manager [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Compute nodes ['19ca8ba5-bd08-4664-b5ea-7bb8423a24bc'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 562.672976] env[66583]: INFO nova.compute.manager [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 562.692053] env[66583]: WARNING nova.compute.manager [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
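With the option dump closed by its asterisk banner, startup proper begins: the node identity is generated and persisted, and the ComputeHostNotFound warnings are the expected first-start path on a fresh host. The Acquiring / acquired / released triplets in the records that follow come from oslo.concurrency wrapping ResourceTracker methods in a named lock; the `inner .../lockutils.py:404` tag on each of them is the decorator's wrapper function. A minimal sketch of the pattern, assuming only the public lockutils API rather than the actual ResourceTracker source:

```python
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def clean_compute_node_cache():
    # The body runs under the "compute_resources" lock. The wrapper logs
    # 'Acquiring lock ...', 'Lock ... acquired ... waited Ns' and
    # 'Lock ... released ... held Ns' around it, which is exactly the
    # triplet pattern in the records below.
    pass
```

The waited 0.000s and held 0.342s figures below are that wrapper timing how long each caller queued for the lock and how long its critical section held it.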
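Further down, the resource tracker reports its view to Placement: it creates a resource provider for the node UUID, then replaces that provider's inventory, and each successful write bumps the provider generation (0 to 1 for update_inventory, 1 to 2 for update_traits). Nova drives this through its scheduler report client (the `set_inventory_for_provider` tag below), but the equivalent raw Placement call is a single PUT; a hedged sketch with the endpoint and token as placeholders, using the inventory exactly as logged below:

```python
import requests

PLACEMENT = 'http://placement.example:8778'  # placeholder endpoint
RP_UUID = '19ca8ba5-bd08-4664-b5ea-7bb8423a24bc'

payload = {
    # Must match the provider's current generation; a stale value gets a
    # 409 back, and the report client refreshes its view and retries.
    'resource_provider_generation': 0,
    'inventories': {
        'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                 'step_size': 1, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                      'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94,
                    'step_size': 1, 'allocation_ratio': 1.0},
    },
}

resp = requests.put(
    f'{PLACEMENT}/resource_providers/{RP_UUID}/inventories',
    json=payload,
    headers={'X-Auth-Token': '<token>',  # placeholder credential
             'OpenStack-API-Version': 'placement 1.26'},
)
resp.raise_for_status()
# The response body carries the new resource_provider_generation (1 here),
# which the client must remember for its next write.
```

The allocation_ratio values explain the scheduler's effective capacity: with VCPU total 48 and allocation_ratio 4.0, Placement will admit allocations up to 192 vCPUs against this provider, while max_unit 16 caps what any single instance can request.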
[ 562.692336] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.692533] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.692680] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 562.692833] env[66583]: DEBUG nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 562.693936] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5debdac0-451e-418e-88f0-2f9313c37589 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.702795] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705dabf2-092f-4a0b-a283-56c3c661070d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.716454] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7767c8a1-6e8b-4462-bb7c-626981b28a68 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.722503] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00540cd8-cfac-45e0-9214-8352158d9801 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.750601] env[66583]: DEBUG nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180946MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 562.750777] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 562.750918] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 562.762614] env[66583]: WARNING nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] No compute node 
record for cpu-1:19ca8ba5-bd08-4664-b5ea-7bb8423a24bc: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc could not be found. [ 562.774833] env[66583]: INFO nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc [ 562.824042] env[66583]: DEBUG nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 562.824247] env[66583]: DEBUG nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 562.925172] env[66583]: INFO nova.scheduler.client.report [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] [req-2fc6fb63-3b49-470c-a3f8-dac7140f9708] Created resource provider record via placement API for resource provider with UUID 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 562.941467] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98780d2a-b0ae-4a0e-8e4f-2a444b5f5fd0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.948903] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b54bcd4f-d051-4764-880a-bcb3396adec2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.978100] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c03dadb1-f3b7-48a3-8ebe-c0fc928944f8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.984797] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b776a78-bf9d-41ff-946b-749933e648fc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 562.997278] env[66583]: DEBUG nova.compute.provider_tree [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Updating inventory in ProviderTree for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 563.031963] env[66583]: DEBUG nova.scheduler.client.report [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Updated inventory for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 563.032400] env[66583]: DEBUG nova.compute.provider_tree [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Updating resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc generation from 0 to 1 during operation: update_inventory {{(pid=66583) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 563.032400] env[66583]: DEBUG nova.compute.provider_tree [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Updating inventory in ProviderTree for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 563.074728] env[66583]: DEBUG nova.compute.provider_tree [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Updating resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc generation from 1 to 2 during operation: update_traits {{(pid=66583) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 563.092980] env[66583]: DEBUG nova.compute.resource_tracker [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 563.093125] env[66583]: DEBUG oslo_concurrency.lockutils [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.342s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 563.093300] env[66583]: DEBUG nova.service [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Creating RPC server for service compute {{(pid=66583) start /opt/stack/nova/nova/service.py:182}} [ 563.107392] env[66583]: DEBUG nova.service [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] Join ServiceGroup membership for this service compute {{(pid=66583) start /opt/stack/nova/nova/service.py:199}} [ 563.107605] env[66583]: DEBUG nova.servicegroup.drivers.db [None req-0cbc4406-79a7-43d6-b879-448740cbbb14 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=66583) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 592.110063] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_power_states {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 592.121166] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Getting list of instances from cluster (obj){ [ 592.121166] env[66583]: value = "domain-c8" [ 592.121166] env[66583]: _type = "ClusterComputeResource" [ 592.121166] env[66583]: } {{(pid=66583) list_instances 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 592.122521] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c760337b-494a-4b12-8687-fe02174e4cf7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.132012] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Got total of 0 instances {{(pid=66583) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 592.132229] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 592.132525] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Getting list of instances from cluster (obj){ [ 592.132525] env[66583]: value = "domain-c8" [ 592.132525] env[66583]: _type = "ClusterComputeResource" [ 592.132525] env[66583]: } {{(pid=66583) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 592.133360] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a946406-4446-47c6-a01e-22d18bee27b5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.140991] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Got total of 0 instances {{(pid=66583) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 599.589480] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 599.591549] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.608850] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 599.696880] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 599.697157] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 599.698984] env[66583]: INFO nova.compute.claims [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 599.808936] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9ba13b0-6cb2-4d3d-8886-e876215d35d5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.818568] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3e97e21-afa8-4181-9b15-e43ddc48e5db {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.849908] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6598939e-5554-4326-9a0d-45d9d37fd782 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.857991] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3fc2082-6b96-4b0e-a249-8e8676b14ad7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 599.873400] env[66583]: DEBUG nova.compute.provider_tree [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 599.881205] env[66583]: DEBUG nova.scheduler.client.report [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 599.896941] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 
tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.198s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 599.896941] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 599.928698] env[66583]: DEBUG nova.compute.utils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 599.930241] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 599.930549] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 599.939904] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 600.004630] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
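Every claim and audit in this log serializes on the same "compute_resources" lock (held for just 0.198s in the record above). The Acquiring/acquired/released lines come from oslo.concurrency; a minimal sketch of the pattern, with a plain dict standing in for the ResourceTracker state (names and fields are illustrative, not Nova's real implementation):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(tracker, instance_uuid, vcpus, memory_mb):
        # All claims and the periodic update_available_resource audit take
        # this same lock, so usage accounting never interleaves.
        tracker['used_vcpus'] += vcpus
        tracker['used_ram_mb'] += memory_mb
        tracker['instances'].add(instance_uuid)

    tracker = {'used_vcpus': 0, 'used_ram_mb': 512, 'instances': set()}
    instance_claim(tracker, '5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e', 1, 128)

The synchronized decorator emits the same DEBUG "Lock ... acquired by ..." / "released" pairs seen throughout this section.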
[ 602.464637] env[66583]: DEBUG nova.policy [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb173cc939304a8ea45f6fef11acfcbd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6b54f8d0f0984c0fa2ceb9f4e7fdb223', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 603.711412] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Successfully created port: 94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 604.586085] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 604.586396] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 604.586555] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 604.586739] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 604.586893] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [
604.587073] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 604.587361] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 604.587699] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 604.587928] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 604.589827] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 604.589827] env[66583]: DEBUG nova.virt.hardware [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 604.589827] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91821cdc-ea59-4d65-999d-aa271f26cf58 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.598770] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d680cdc4-08cf-4f21-b846-c66f742b98e6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 604.617310] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0506a8a1-1b6f-41ae-b4d5-5ada077a309b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.442590] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "fac7d6a8-d74e-4130-8068-236289d5d616" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.442901] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 
tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "fac7d6a8-d74e-4130-8068-236289d5d616" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.470643] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 605.535398] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 605.535621] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 605.537133] env[66583]: INFO nova.compute.claims [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 605.664501] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af2519ab-71fe-4784-ba28-5d7530177129 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.672914] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98e02070-9ad1-4913-92eb-2b2dcc5084ef {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.711384] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-349c572f-8c07-4839-b5cb-5c9208775fb5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.719569] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cc6fac7-fb8c-4ccc-ae1f-f9592f64c848 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 605.736874] env[66583]: DEBUG nova.compute.provider_tree [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 605.747028] env[66583]: DEBUG nova.scheduler.client.report [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed for provider 
19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 605.762544] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.227s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 605.763197] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 605.803076] env[66583]: DEBUG nova.compute.utils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 605.804157] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 605.804382] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 605.821462] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 605.908313] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 605.929365] env[66583]: DEBUG nova.policy [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a2f40b0c6be4850b45d7af29b0ef446', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f54e44041809424ba5090e357365305c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 606.145709] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Successfully updated port: 94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 606.161670] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 606.161862] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquired lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 606.162267] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 606.245891] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 606.362227] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Successfully created port: 73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 606.477632] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Updating instance_info_cache with network_info: [{"id": "94295550-570e-4909-8b04-eda0639d4b0f", "address": "fa:16:3e:df:e6:db", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94295550-57", "ovs_interfaceid": "94295550-570e-4909-8b04-eda0639d4b0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 606.494210] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Releasing lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 606.496366] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Instance network_info: |[{"id": "94295550-570e-4909-8b04-eda0639d4b0f", "address": "fa:16:3e:df:e6:db", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": 
"nsxv3"}}, "devname": "tap94295550-57", "ovs_interfaceid": "94295550-570e-4909-8b04-eda0639d4b0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 606.496773] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:df:e6:db', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '94295550-570e-4909-8b04-eda0639d4b0f', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 606.508779] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.510058] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e723b9af-d91f-4a1f-94a1-6c9df7a859ff {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.524374] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Created folder: OpenStack in parent group-v4. [ 606.524591] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating folder: Project (6b54f8d0f0984c0fa2ceb9f4e7fdb223). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.524813] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e43890e-0a94-4b4c-94a1-6976ef23f204 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.535169] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Created folder: Project (6b54f8d0f0984c0fa2ceb9f4e7fdb223) in parent group-v693485. [ 606.536039] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating folder: Instances. Parent ref: group-v693486. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 606.536039] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d2b218b9-b5bf-46c8-b4c2-356d5a42fc99 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.547100] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Created folder: Instances in parent group-v693486. [ 606.547100] env[66583]: DEBUG oslo.service.loopingcall [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 606.547100] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 606.547100] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2e416226-b7eb-4562-af36-3c63f196e518 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 606.566184] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 606.566184] env[66583]: value = "task-3470218" [ 606.566184] env[66583]: _type = "Task" [ 606.566184] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 606.574799] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470218, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.076984] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470218, 'name': CreateVM_Task, 'duration_secs': 0.373364} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
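CreateVM_Task completes in about 0.37s after one poll. The pattern above (invoke the SOAP method, then poll the returned Task object until it reports success or error) is oslo.vmware's standard task handling; a rough sketch under stated assumptions, with the connection details and managed-object references as hypothetical placeholders, not values from this log:

    from oslo_vmware import api as vmware_api

    # Placeholder managed-object references; in the driver these come from
    # earlier PropertyCollector lookups. They are stand-ins, not real values.
    vm_folder_ref = config_spec = resource_pool_ref = None

    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'svc-user', 'svc-password',
        api_retry_count=10, task_poll_interval=0.5)

    # invoke_api() issues the SOAP call; wait_for_task() polls the Task
    # managed object (the "progress is 0%." lines above) and raises if the
    # task ends in an error state.
    task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder_ref,
                              config=config_spec, pool=resource_pool_ref)
    task_info = session.wait_for_task(task)

On success wait_for_task() returns the task info, which is how the driver learns the new VM's managed object reference.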
[ 607.077317] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.092082] env[66583]: DEBUG oslo_vmware.service [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0c1590e-32d6-4acb-b556-f9dd634fa51b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.099390] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.099613] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.100262] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 607.100527] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2516f594-9323-44d5-9d67-5d0ed2073d47 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.105602] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Waiting for the task: (returnval){ [ 607.105602] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]522cdc55-b2ca-7922-7b00-2507aacc6f2b" [ 607.105602] env[66583]: _type = "Task" [ 607.105602] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 607.113873] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]522cdc55-b2ca-7922-7b00-2507aacc6f2b, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 607.307119] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 607.307119] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 607.307119] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 607.307858] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 607.308199] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 607.308724] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 607.309363] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 607.309679] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 607.310062] 
env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 607.312029] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 607.312029] env[66583]: DEBUG nova.virt.hardware [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 607.312029] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da42106d-6ef1-48e0-a135-1623ffb52158 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.321874] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01bcfcbc-6038-45ed-b69a-1edc661a4a19 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 607.339232] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Successfully updated port: 73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 607.355369] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 607.356605] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 607.356935] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 607.419704] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 607.618645] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 607.619183] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 607.621316] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 607.621316] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 607.621316] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 607.621316] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1faeec67-8692-4140-bccb-eb3b41fecbc5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.643719] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 607.643719] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 607.643719] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6c448f3-232c-43dc-ad69-796d0f69fb58 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.652491] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-153eaa38-c3fb-4197-8018-37eb4e0d5324 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.662955] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Waiting for the task: (returnval){
[ 607.662955] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]520b1670-ba2c-daf7-dfac-c4ff3fa49dc3"
[ 607.662955] env[66583]: _type = "Task"
[ 607.662955] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 607.679296] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 607.679296] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating directory with path [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 607.679296] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3544d1dd-c675-4d0f-bba3-9910cf327a9a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.707017] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "f5415bfe-3f3a-4f4b-985d-59655791bb2b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 607.707017] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "f5415bfe-3f3a-4f4b-985d-59655791bb2b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 607.710027] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Created directory with path [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 607.710027] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Fetch image to [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 607.710027] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 607.710536] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-341c9470-9ddd-419c-9c95-461edd2eb244 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.719785] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b1f4563-9788-4006-9b40-b921914e1d9c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.722827] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 607.737538] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d66d99c8-7963-450c-b194-66c7fa41578d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.781120] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2672638-a7b7-4d3b-b6d9-446759f0ad56 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.793986] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-58145700-2a92-41c8-9675-dc2958bed186 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.804475] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 607.804726] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 607.806228] env[66583]: INFO nova.compute.claims [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 607.867696] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "a0bd3693-ed3f-4573-8250-85ae19a08869" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 607.868119] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "a0bd3693-ed3f-4573-8250-85ae19a08869" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 607.882606] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 607.888560] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 607.940075] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 607.956618] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-854e0949-5dfd-48e7-9f25-9a6363e6afa2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.964382] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-19d6d35c-0aca-45e5-be55-44ad2e462641 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 607.969818] env[66583]: DEBUG oslo_vmware.rw_handles [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 608.052226] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ced07dc1-3969-49ff-a9b9-f78f08bb1c4b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.058064] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Updating instance_info_cache with network_info: [{"id": "73d0c098-473b-4e9f-833c-457d889d94d6", "address": "fa:16:3e:73:fb:3a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.48", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73d0c098-47", "ovs_interfaceid": "73d0c098-473b-4e9f-833c-457d889d94d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 608.060383] env[66583]: DEBUG oslo_vmware.rw_handles [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 608.060588] env[66583]: DEBUG oslo_vmware.rw_handles [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 608.065165] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-131a2d8a-0066-4dcb-92e4-75c047fce93f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.072170] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 608.072408] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Instance network_info: |[{"id": "73d0c098-473b-4e9f-833c-457d889d94d6", "address": "fa:16:3e:73:fb:3a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.48", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73d0c098-47", "ovs_interfaceid": "73d0c098-473b-4e9f-833c-457d889d94d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 608.081706] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:73:fb:3a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73d0c098-473b-4e9f-833c-457d889d94d6', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 608.089293] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating folder: Project (f54e44041809424ba5090e357365305c). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 608.089771] env[66583]: DEBUG nova.compute.provider_tree [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 608.090890] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3b892e09-9923-4ca7-a9d7-bca9ce9cf43c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.102672] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created folder: Project (f54e44041809424ba5090e357365305c) in parent group-v693485.
[ 608.102813] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating folder: Instances. Parent ref: group-v693489. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 608.103569] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7989268b-a351-44b6-a2ab-16c8ae091755 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.105815] env[66583]: DEBUG nova.scheduler.client.report [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 608.118878] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created folder: Instances in parent group-v693489.
[ 608.118878] env[66583]: DEBUG oslo.service.loopingcall [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 608.118878] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 608.118878] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-bc6e79af-e9c1-44e8-899d-1a895363742e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.133872] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 608.134370] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 608.137065] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.198s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 608.138418] env[66583]: INFO nova.compute.claims [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 608.143016] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 608.143016] env[66583]: value = "task-3470221"
[ 608.143016] env[66583]: _type = "Task"
[ 608.143016] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 608.151974] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470221, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 608.168137] env[66583]: DEBUG nova.compute.utils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 608.169452] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Not allocating networking since 'none' was specified. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 608.180338] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 608.278087] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 608.307341] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 608.307589] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 608.307866] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 608.307937] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 608.308112] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 608.308274] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 608.308503] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 608.308674] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 608.308854] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 608.309029] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 608.309268] env[66583]: DEBUG nova.virt.hardware [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 608.310172] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b30b4ce7-481b-4fa5-9ae1-005794effd20 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.325583] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53966b40-ce38-46de-8acd-fbc1dd31388c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.332397] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7a53ae7-d2cf-430b-8cbe-e2f14e9c1d82 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.346762] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Instance VIF info [] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 608.352807] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Creating folder: Project (44004b01f82642a39fdfed80ea0d48eb). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 608.354650] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-86c1e81e-1a71-4c88-8750-53aa3c590aaf {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.362350] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42e86db4-41df-4b0a-a1a2-d59d38e4c91a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.367129] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Created folder: Project (44004b01f82642a39fdfed80ea0d48eb) in parent group-v693485.
[ 608.367129] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Creating folder: Instances. Parent ref: group-v693492. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 608.367346] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f4a97ed3-2cd5-44fc-bd84-acc193f736d7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.399561] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fafb8075-849e-43af-8c6e-01ea930b4945 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.407329] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5425c096-0cf4-43d8-b490-12b647f4d2f9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.412933] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Created folder: Instances in parent group-v693492.
[ 608.413216] env[66583]: DEBUG oslo.service.loopingcall [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 608.413763] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 608.413985] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-04d98fc0-cb54-4268-93a9-b8e748296387 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.434022] env[66583]: DEBUG nova.compute.provider_tree [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 608.440918] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 608.440918] env[66583]: value = "task-3470224"
[ 608.440918] env[66583]: _type = "Task"
[ 608.440918] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 608.444945] env[66583]: DEBUG nova.scheduler.client.report [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 608.450853] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470224, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 608.461462] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.324s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 608.462012] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 608.496210] env[66583]: DEBUG nova.compute.utils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 608.497588] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 608.497775] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 608.512051] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 608.588152] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 608.620605] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 608.620941] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 608.620977] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 608.621399] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 608.621607] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 608.621888] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 608.622230] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 608.622476] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 608.622873] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 608.623129] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 608.623361] env[66583]: DEBUG nova.virt.hardware [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 608.624688] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2c71766-6559-4534-9494-3795ba3a2e46 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.634185] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a654610-4408-4ea1-b98c-38dec2e98968 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.659818] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470221, 'name': CreateVM_Task, 'duration_secs': 0.293946} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 608.660021] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 608.660698] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 608.660864] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 608.661194] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 608.661775] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8d1d8059-b312-4049-b6c4-675abc277403 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 608.666134] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){
[ 608.666134] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]523f51a4-c2f4-f54c-b941-4e024aa25faa"
[ 608.666134] env[66583]: _type = "Task"
[ 608.666134] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 608.677079] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]523f51a4-c2f4-f54c-b941-4e024aa25faa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 608.708459] env[66583]: DEBUG nova.policy [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'bd2ed28991cb4ba48f6b642b88b0f4c0', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a6c2f60f0f224fec990c5f5b0c90d0ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}}
[ 608.951924] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470224, 'name': CreateVM_Task, 'duration_secs': 0.263278} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 608.952187] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 608.952819] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 609.058075] env[66583]: DEBUG nova.compute.manager [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Received event network-vif-plugged-94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 609.058294] env[66583]: DEBUG oslo_concurrency.lockutils [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] Acquiring lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.058502] env[66583]: DEBUG oslo_concurrency.lockutils [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] Lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.058661] env[66583]: DEBUG oslo_concurrency.lockutils [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] Lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.058818] env[66583]: DEBUG nova.compute.manager [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] No waiting events found dispatching network-vif-plugged-94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 609.058983] env[66583]: WARNING nova.compute.manager [req-685c1199-9d8d-4a12-96cf-0bbe9b2b3158 req-06b967d3-5e45-49c7-84e0-574dc9228efe service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Received unexpected event network-vif-plugged-94295550-570e-4909-8b04-eda0639d4b0f for instance with vm_state building and task_state spawning.
[ 609.179266] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 609.179582] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 609.179956] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 609.180372] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 609.180628] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 609.180967] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e78b8a7c-e5f1-41c7-8a9e-a10242e723fe {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.186215] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for the task: (returnval){
[ 609.186215] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52decc3e-396f-1fa2-9513-68e2ac155872"
[ 609.186215] env[66583]: _type = "Task"
[ 609.186215] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 609.195141] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52decc3e-396f-1fa2-9513-68e2ac155872, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 609.418410] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.419027] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.435175] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 609.478098] env[66583]: DEBUG nova.compute.manager [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Received event network-vif-plugged-73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 609.478098] env[66583]: DEBUG oslo_concurrency.lockutils [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] Acquiring lock "fac7d6a8-d74e-4130-8068-236289d5d616-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.478098] env[66583]: DEBUG oslo_concurrency.lockutils [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] Lock "fac7d6a8-d74e-4130-8068-236289d5d616-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.478098] env[66583]: DEBUG oslo_concurrency.lockutils [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] Lock "fac7d6a8-d74e-4130-8068-236289d5d616-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.478300] env[66583]: DEBUG nova.compute.manager [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] No waiting events found dispatching network-vif-plugged-73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 609.478973] env[66583]: WARNING nova.compute.manager [req-168635d6-e390-4029-b624-188305fb917d req-637bf11b-4b8d-41e7-b300-bc350f593993 service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Received unexpected event network-vif-plugged-73d0c098-473b-4e9f-833c-457d889d94d6 for instance with vm_state building and task_state spawning.
[ 609.503026] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 609.503268] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 609.505907] env[66583]: INFO nova.compute.claims [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 609.661344] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ba5f8d5-9daf-41e3-9031-5b8c8c4969cc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.670544] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-485495f4-b0c0-4e3f-8e2f-0b7835ec60e2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.709315] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd6a1250-cc9d-4aec-8e74-743411b53716 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.722423] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 609.722673] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 609.722888] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 609.724532] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4251eae1-8b3e-48f3-ae6c-656dcee27e60 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 609.740042] env[66583]: DEBUG nova.compute.provider_tree [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 609.750407] env[66583]: DEBUG nova.scheduler.client.report [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 609.774376] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 609.774376] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 609.823025] env[66583]: DEBUG nova.compute.utils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 609.823765] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 609.823765] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 609.843543] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 609.938422] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 609.984131] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 609.984567] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 609.984567] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 609.984737] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 609.984885] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 609.985523] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 609.986016] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 609.987014] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 609.987253] env[66583]: DEBUG nova.virt.hardware [None
req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 609.987535] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 609.987666] env[66583]: DEBUG nova.virt.hardware [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 609.989232] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6019591b-e1b3-4d8b-a017-baeff32eae72 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 609.994120] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Successfully created port: e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 610.001729] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55b46e3c-63e2-44b6-9734-e1814835a1aa {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 610.037423] env[66583]: DEBUG nova.policy [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0ecb4668e2e744a88105fdd24c218a7d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e2312f89be49486c9b896d1c925cb928', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 611.971118] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Successfully created port: d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 612.775557] env[66583]: DEBUG nova.compute.manager [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Received event network-changed-94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 612.775557] env[66583]: DEBUG nova.compute.manager [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] [instance: 
5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Refreshing instance network info cache due to event network-changed-94295550-570e-4909-8b04-eda0639d4b0f. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 612.775557] env[66583]: DEBUG oslo_concurrency.lockutils [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] Acquiring lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.775557] env[66583]: DEBUG oslo_concurrency.lockutils [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] Acquired lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.775557] env[66583]: DEBUG nova.network.neutron [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Refreshing network info cache for port 94295550-570e-4909-8b04-eda0639d4b0f {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 612.849100] env[66583]: DEBUG nova.compute.manager [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Received event network-changed-73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 612.851209] env[66583]: DEBUG nova.compute.manager [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Refreshing instance network info cache due to event network-changed-73d0c098-473b-4e9f-833c-457d889d94d6. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 612.851384] env[66583]: DEBUG oslo_concurrency.lockutils [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] Acquiring lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 612.851479] env[66583]: DEBUG oslo_concurrency.lockutils [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] Acquired lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 612.851635] env[66583]: DEBUG nova.network.neutron [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Refreshing network info cache for port 73d0c098-473b-4e9f-833c-457d889d94d6 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 613.120087] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Successfully updated port: e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 613.138328] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 613.138602] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquired lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 613.138663] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 613.467214] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 614.412270] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Updating instance_info_cache with network_info: [{"id": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "address": "fa:16:3e:58:25:a6", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.251", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape053d265-aa", "ovs_interfaceid": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.425284] env[66583]: DEBUG nova.network.neutron [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Updated VIF entry in instance network info cache for port 94295550-570e-4909-8b04-eda0639d4b0f. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 614.425597] env[66583]: DEBUG nova.network.neutron [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Updating instance_info_cache with network_info: [{"id": "94295550-570e-4909-8b04-eda0639d4b0f", "address": "fa:16:3e:df:e6:db", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap94295550-57", "ovs_interfaceid": "94295550-570e-4909-8b04-eda0639d4b0f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.428317] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Releasing lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.428317] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Instance network_info: |[{"id": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "address": "fa:16:3e:58:25:a6", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.251", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape053d265-aa", "ovs_interfaceid": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 614.432021] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 
tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:58:25:a6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e053d265-aae9-462c-87ec-6be1fb55eaaa', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 614.442181] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Creating folder: Project (a6c2f60f0f224fec990c5f5b0c90d0ff). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 614.443885] env[66583]: DEBUG nova.network.neutron [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Updated VIF entry in instance network info cache for port 73d0c098-473b-4e9f-833c-457d889d94d6. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 614.445563] env[66583]: DEBUG nova.network.neutron [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Updating instance_info_cache with network_info: [{"id": "73d0c098-473b-4e9f-833c-457d889d94d6", "address": "fa:16:3e:73:fb:3a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.48", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73d0c098-47", "ovs_interfaceid": "73d0c098-473b-4e9f-833c-457d889d94d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 614.445563] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-94787c57-919d-452e-bdaa-9cbae5b74e55 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.448028] env[66583]: DEBUG oslo_concurrency.lockutils [req-15581374-8630-4d9c-9c12-e356dd3c458b req-b54be2a0-17b9-413e-a522-ab4db998b4b9 service nova] Releasing lock "refresh_cache-5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.459894] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 
tempest-ServerDiagnosticsTest-1751289038-project-member] Created folder: Project (a6c2f60f0f224fec990c5f5b0c90d0ff) in parent group-v693485. [ 614.460752] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Creating folder: Instances. Parent ref: group-v693495. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 614.460752] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-52642599-63a5-4dee-b0b3-062785145caf {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.468336] env[66583]: DEBUG oslo_concurrency.lockutils [req-e02fffc5-0953-4177-af07-c2faa12fa5c9 req-f4c1090e-6ebf-4078-b967-bcbae47230fb service nova] Releasing lock "refresh_cache-fac7d6a8-d74e-4130-8068-236289d5d616" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 614.475764] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Created folder: Instances in parent group-v693495. [ 614.475919] env[66583]: DEBUG oslo.service.loopingcall [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 614.476370] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 614.476707] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-03fbeda1-6325-4158-8ee5-f8ae2d4d7f86 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 614.500109] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 614.500109] env[66583]: value = "task-3470227" [ 614.500109] env[66583]: _type = "Task" [ 614.500109] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 614.516843] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470227, 'name': CreateVM_Task} progress is 5%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 614.895021] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Successfully updated port: d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 614.904477] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 614.904803] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquired lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 614.905085] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 615.010480] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470227, 'name': CreateVM_Task, 'duration_secs': 0.287116} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 615.010643] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 615.011327] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.011482] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.011803] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 615.012102] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-467369d2-31dc-471f-bbfe-f13106d7209c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.017408] env[66583]: DEBUG oslo_vmware.api [None 
req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Waiting for the task: (returnval){ [ 615.017408] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52f2b305-0734-967f-3693-5739773828a3" [ 615.017408] env[66583]: _type = "Task" [ 615.017408] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.023765] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 615.029576] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52f2b305-0734-967f-3693-5739773828a3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.536535] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.536535] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 615.536535] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.687428] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Updating instance_info_cache with network_info: [{"id": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "address": "fa:16:3e:b6:46:5a", "network": {"id": "940d8bd0-2fde-47c7-861c-8339e42db572", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1076660731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e2312f89be49486c9b896d1c925cb928", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, 
"nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5548e7f-4c", "ovs_interfaceid": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 615.701596] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Releasing lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 615.701596] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance network_info: |[{"id": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "address": "fa:16:3e:b6:46:5a", "network": {"id": "940d8bd0-2fde-47c7-861c-8339e42db572", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1076660731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e2312f89be49486c9b896d1c925cb928", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5548e7f-4c", "ovs_interfaceid": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 615.701868] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b6:46:5a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c8ee8640-3787-4c27-9581-962ddb2be7e5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd5548e7f-4cb8-4a79-a167-ab601e2139ce', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 615.713213] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Creating folder: Project (e2312f89be49486c9b896d1c925cb928). Parent ref: group-v693485. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.719053] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-28dfbf76-5129-41c1-b9f3-dc0ef8020ba3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.732836] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Created folder: Project (e2312f89be49486c9b896d1c925cb928) in parent group-v693485. [ 615.733134] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Creating folder: Instances. Parent ref: group-v693498. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 615.733396] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0d1ace28-80bf-4e8d-b80a-73301e03cf24 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.748728] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Created folder: Instances in parent group-v693498. [ 615.748985] env[66583]: DEBUG oslo.service.loopingcall [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 615.749199] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 615.752831] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c59e62de-a86d-4ad1-8f13-45c6b7530c4a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 615.779797] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 615.779797] env[66583]: value = "task-3470230" [ 615.779797] env[66583]: _type = "Task" [ 615.779797] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 615.791187] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470230, 'name': CreateVM_Task} progress is 5%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 615.793865] env[66583]: DEBUG nova.compute.manager [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Received event network-vif-plugged-d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 615.793983] env[66583]: DEBUG oslo_concurrency.lockutils [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] Acquiring lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.794585] env[66583]: DEBUG oslo_concurrency.lockutils [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.795367] env[66583]: DEBUG oslo_concurrency.lockutils [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.795367] env[66583]: DEBUG nova.compute.manager [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] No waiting events found dispatching network-vif-plugged-d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 615.795367] env[66583]: WARNING nova.compute.manager [req-8164ba49-1b40-4da6-9899-0756debfe662 req-f1e492ca-8d39-4901-8e7c-87e53b6f1112 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Received unexpected event network-vif-plugged-d5548e7f-4cb8-4a79-a167-ab601e2139ce for instance with vm_state building and task_state spawning. 
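The task-polling records interleaved above ('Waiting for the task: (returnval){ ... } to complete', then 'progress is 5%.', then 'completed successfully') are produced by oslo.vmware's wait_for_task loop, which the records themselves cite (oslo_vmware/api.py wait_for_task:397 and _poll_task:434). A minimal sketch of the calling pattern, assuming placeholder vCenter credentials and prebuilt managed-object references; this is not Nova's actual spawn path, only the oslo.vmware API it delegates to:

    from oslo_vmware import api

    # Session setup mirrors the _create_session records near the top of this
    # log; host and credentials here are placeholders, not real endpoints.
    session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                   api_retry_count=10, task_poll_interval=0.5)

    # Assumed prebuilt: a Folder managed-object ref, a
    # VirtualMachineConfigSpec, and a ResourcePool ref.
    folder_ref = config_spec = resource_pool_ref = ...

    # invoke_api() issues the SOAP call (the 'Invoking Folder.CreateVM_Task'
    # records above); wait_for_task() then polls the returned task, logging
    # the 'Waiting for the task' / 'progress is N%' lines until it finishes.
    task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                              config=config_spec, pool=resource_pool_ref)
    task_info = session.wait_for_task(task)
    print(task_info.result)  # the created VM's managed object reference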
[ 615.936418] env[66583]: DEBUG nova.compute.manager [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Received event network-vif-plugged-e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 615.936677] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Acquiring lock "a0bd3693-ed3f-4573-8250-85ae19a08869-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 615.936876] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Lock "a0bd3693-ed3f-4573-8250-85ae19a08869-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 615.939172] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Lock "a0bd3693-ed3f-4573-8250-85ae19a08869-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 615.939422] env[66583]: DEBUG nova.compute.manager [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] No waiting events found dispatching network-vif-plugged-e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 615.939627] env[66583]: WARNING nova.compute.manager [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Received unexpected event network-vif-plugged-e053d265-aae9-462c-87ec-6be1fb55eaaa for instance with vm_state building and task_state spawning. [ 615.939797] env[66583]: DEBUG nova.compute.manager [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Received event network-changed-e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 615.939999] env[66583]: DEBUG nova.compute.manager [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Refreshing instance network info cache due to event network-changed-e053d265-aae9-462c-87ec-6be1fb55eaaa. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 615.940244] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Acquiring lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 615.940391] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Acquired lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 615.940550] env[66583]: DEBUG nova.network.neutron [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Refreshing network info cache for port e053d265-aae9-462c-87ec-6be1fb55eaaa {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 616.093430] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "3816b87a-030d-4362-9596-bd0899455e52" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.093680] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.109595] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 616.179342] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 616.179685] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 616.181212] env[66583]: INFO nova.compute.claims [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 616.295875] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470230, 'name': CreateVM_Task, 'duration_secs': 0.291961} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 616.298518] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 616.299381] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.299937] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 616.299937] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 616.300353] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-384d58ea-23de-42a9-bef9-feaf5ecc8c37 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.307498] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for the task: (returnval){ [ 616.307498] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5276782e-fdbd-acf7-448e-8775ce31962b" [ 616.307498] 
env[66583]: _type = "Task" [ 616.307498] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 616.317422] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5276782e-fdbd-acf7-448e-8775ce31962b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 616.366336] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-257439b9-2764-434c-a243-fbb540a7afc1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.374659] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7af7e913-3de0-4aab-b81f-2b1566e349b8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.408322] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-760ac13c-a6a7-4de6-8014-9b069a8121ee {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.418748] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb89140a-bc0d-431b-82ce-7bd6fd5356c2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.435913] env[66583]: DEBUG nova.compute.provider_tree [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 616.445600] env[66583]: DEBUG nova.scheduler.client.report [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 616.463080] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 616.463625] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Start building networks asynchronously for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 616.504208] env[66583]: DEBUG nova.compute.utils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 616.504748] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 616.504882] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 616.524927] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 616.625338] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 616.657234] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 616.657461] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 616.658751] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 616.658751] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 616.658751] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 616.658972] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 616.659256] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 616.661056] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 616.661056] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 616.661056] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 616.661056] env[66583]: DEBUG nova.virt.hardware [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 616.662749] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7dfeff-ffed-400c-8328-3c4ca3bce137 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.672218] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bb4485b-be55-407d-a3cf-a89ad69de440 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 616.820115] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 616.821096] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 616.821434] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 616.913433] env[66583]: DEBUG nova.policy [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8354785d0fc24568a880c1278ed54774', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e3facfa4c0524c0daa5b43a40dfb1950', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} 
{{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 617.318417] env[66583]: DEBUG nova.network.neutron [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Updated VIF entry in instance network info cache for port e053d265-aae9-462c-87ec-6be1fb55eaaa. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 617.318917] env[66583]: DEBUG nova.network.neutron [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Updating instance_info_cache with network_info: [{"id": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "address": "fa:16:3e:58:25:a6", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.251", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape053d265-aa", "ovs_interfaceid": "e053d265-aae9-462c-87ec-6be1fb55eaaa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 617.332693] env[66583]: DEBUG oslo_concurrency.lockutils [req-e647b8fd-f88e-4e3c-8b2e-30d6dedd9a21 req-18a83328-a70b-4f79-b70f-d96b8244d223 service nova] Releasing lock "refresh_cache-a0bd3693-ed3f-4573-8250-85ae19a08869" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 617.667419] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Successfully created port: 556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 618.855077] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.856121] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.857207] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 618.857475] env[66583]: DEBUG nova.compute.manager [None 
req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 618.886208] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.886396] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.886490] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.886615] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.886736] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.886934] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 618.887028] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
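The healer pass above rebuilds its worklist and then passes over every instance still in the Building state, which is why the burst ends with "Didn't find any instances for network info cache update." A minimal stdlib rendering of that filter; names are illustrative, not nova's implementation:

BUILDING = "building"

def instances_to_heal(instances):
    healable = []
    for inst in instances:
        if inst["vm_state"] == BUILDING:
            print(f"[instance: {inst['uuid']}] Skipping network cache "
                  "update for instance because it is Building.")
            continue
        healable.append(inst)
    return healable

pending = [{"uuid": u, "vm_state": BUILDING}
           for u in ("5e2ed48b", "fac7d6a8", "f5415bfe")]
if not instances_to_heal(pending):
    print("Didn't find any instances for network info cache update.")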
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 618.887494] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.887739] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.887931] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.888243] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.888449] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.888636] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.888833] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
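The "CONF.reclaim_instance_interval <= 0, skipping..." line closing this periodic-task burst is a plain config gate: queued (soft-deleted) instances are only reclaimed when the interval is positive. A minimal sketch with a stand-in config object, not oslo.config:

from dataclasses import dataclass

@dataclass
class Conf:
    reclaim_instance_interval: int = 0   # disabled, as on this host

CONF = Conf()

def reclaim_queued_deletes():
    if CONF.reclaim_instance_interval <= 0:
        print("CONF.reclaim_instance_interval <= 0, skipping...")
        return
    # a real pass would remove SOFT_DELETED instances older than the interval

reclaim_queued_deletes()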
{{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 618.889009] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 618.907554] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.907772] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 618.907963] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 618.909477] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 618.913519] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-777e7e2c-5c49-414a-bc4f-29ecf0e02334 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.927815] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a06428c4-4c95-480b-9c6d-f462c282d602 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.948886] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-346f0e5f-6074-4e88-b1c4-604f0608211f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.955761] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ff4078-5a73-482d-b563-ed07a310b695 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 618.993081] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180949MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 618.993081] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 618.993081] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 619.069397] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.069554] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fac7d6a8-d74e-4130-8068-236289d5d616 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.069683] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance f5415bfe-3f3a-4f4b-985d-59655791bb2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.069803] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a0bd3693-ed3f-4573-8250-85ae19a08869 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.069921] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance da4dff27-123e-44ac-83b5-1b2b3d731e0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.070050] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 3816b87a-030d-4362-9596-bd0899455e52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
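Each of the six instances listed above holds the same placement allocation, and the final resource view a few lines below reports used_ram=1280MB, used_disk=6GB, used_vcpus=6. The numbers reconcile once the 512MB of reserved host memory from the provider inventory is counted as used; a quick check:

allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 6
reserved_ram_mb = 512  # MEMORY_MB 'reserved' in the inventory data below

used_vcpus = sum(a["VCPU"] for a in allocations)
used_ram_mb = sum(a["MEMORY_MB"] for a in allocations) + reserved_ram_mb
used_disk_gb = sum(a["DISK_GB"] for a in allocations)

assert (used_vcpus, used_ram_mb, used_disk_gb) == (6, 1280, 6)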
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 619.070293] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 619.070726] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 619.166860] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Successfully updated port: 556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 619.179745] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.180077] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquired lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.180252] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 619.200917] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5738ede7-f88c-4192-b791-a167a87bf9e3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.215298] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8198fda4-5473-43c7-92c7-8034c77f3654 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.253569] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance cache missing network info. 
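The Acquiring/Acquired/Releasing triplets around "refresh_cache-<uuid>" above are a named-lock pattern that serializes cache rebuilds per instance. A stdlib approximation; the real locks come from oslo.concurrency's lockutils, which also offers fair and inter-process variants:

import threading
from collections import defaultdict
from contextlib import contextmanager

_locks = defaultdict(threading.Lock)

@contextmanager
def lock(name):
    print(f'Acquiring lock "{name}"')
    _locks[name].acquire()
    print(f'Acquired lock "{name}"')
    try:
        yield
    finally:
        print(f'Releasing lock "{name}"')
        _locks[name].release()

with lock("refresh_cache-3816b87a-030d-4362-9596-bd0899455e52"):
    pass  # build the network info cache for the instance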
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 619.256366] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ca3de80-e5ae-4ad4-a8fa-fadf3e813b7e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.264830] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11b662cb-eb7f-48ff-845a-00f4bb6be9a9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 619.282072] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 619.289883] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 619.314891] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 619.314891] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.322s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 619.479642] env[66583]: DEBUG nova.compute.manager [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Received event network-changed-d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 619.480078] env[66583]: DEBUG nova.compute.manager [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Refreshing instance network info cache due to event network-changed-d5548e7f-4cb8-4a79-a167-ab601e2139ce. 
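"Received event network-changed-<port>" followed by "Refreshing network info cache for port <port>" is nova reacting to a neutron notification by refreshing only the affected port's cache entry. A hedged sketch of that dispatch; the handler name and body are invented for illustration:

def handle_external_event(instance_uuid, event):
    prefix = "network-changed-"
    if event.startswith(prefix):
        port_id = event[len(prefix):]
        print(f"[instance: {instance_uuid}] Refreshing network info cache "
              f"for port {port_id}")
        # a real handler takes "refresh_cache-<uuid>", re-queries neutron
        # for this port, and rewrites the matching VIF entry in the cache

handle_external_event("da4dff27-123e-44ac-83b5-1b2b3d731e0a",
                      "network-changed-d5548e7f-4cb8-4a79-a167-ab601e2139ce")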
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 619.480078] env[66583]: DEBUG oslo_concurrency.lockutils [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] Acquiring lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 619.480548] env[66583]: DEBUG oslo_concurrency.lockutils [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] Acquired lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 619.480548] env[66583]: DEBUG nova.network.neutron [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Refreshing network info cache for port d5548e7f-4cb8-4a79-a167-ab601e2139ce {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 620.324740] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Updating instance_info_cache with network_info: [{"id": "556170a7-440a-4392-bd7d-31f611c5d0e4", "address": "fa:16:3e:0e:74:4a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap556170a7-44", "ovs_interfaceid": "556170a7-440a-4392-bd7d-31f611c5d0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.337493] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Releasing lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.337493] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance network_info: |[{"id": "556170a7-440a-4392-bd7d-31f611c5d0e4", "address": "fa:16:3e:0e:74:4a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": 
"192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap556170a7-44", "ovs_interfaceid": "556170a7-440a-4392-bd7d-31f611c5d0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 620.337687] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:74:4a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '556170a7-440a-4392-bd7d-31f611c5d0e4', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 620.345406] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Creating folder: Project (e3facfa4c0524c0daa5b43a40dfb1950). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 620.346571] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b2e77d61-71f1-4d48-9b3c-546b338277f3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.360161] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Created folder: Project (e3facfa4c0524c0daa5b43a40dfb1950) in parent group-v693485. [ 620.360161] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Creating folder: Instances. Parent ref: group-v693501. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 620.360161] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4beb2314-50f8-4027-8a60-f492d8554040 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.372197] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Created folder: Instances in parent group-v693501. 
[ 620.372197] env[66583]: DEBUG oslo.service.loopingcall [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 620.372197] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 620.372197] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3e29cb64-0e2e-45c4-a332-7f906ecaa1e6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.397020] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 620.397020] env[66583]: value = "task-3470233" [ 620.397020] env[66583]: _type = "Task" [ 620.397020] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 620.405129] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470233, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 620.905483] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470233, 'name': CreateVM_Task, 'duration_secs': 0.310463} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 620.905659] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 620.907030] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 620.907030] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 620.907207] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 620.907429] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82db22d0-e0cd-4718-92f7-cd142ee6e04f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.911703] env[66583]: DEBUG nova.network.neutron [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Updated VIF entry in instance network 
info cache for port d5548e7f-4cb8-4a79-a167-ab601e2139ce. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 620.912052] env[66583]: DEBUG nova.network.neutron [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Updating instance_info_cache with network_info: [{"id": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "address": "fa:16:3e:b6:46:5a", "network": {"id": "940d8bd0-2fde-47c7-861c-8339e42db572", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1076660731-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e2312f89be49486c9b896d1c925cb928", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c8ee8640-3787-4c27-9581-962ddb2be7e5", "external-id": "nsx-vlan-transportzone-224", "segmentation_id": 224, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd5548e7f-4c", "ovs_interfaceid": "d5548e7f-4cb8-4a79-a167-ab601e2139ce", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.915504] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for the task: (returnval){ [ 620.915504] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5206bb38-3d72-4168-4d09-f0a629d78eb5" [ 620.915504] env[66583]: _type = "Task" [ 620.915504] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 620.926315] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5206bb38-3d72-4168-4d09-f0a629d78eb5, 'name': SearchDatastore_Task} progress is 0%. 
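The wait_for_task lines above poll the vCenter task object until it leaves its running state, logging "progress is N%." on each pass; CreateVM_Task finished here in about 0.31s. A stdlib-only caricature of that loop (oslo.vmware's real version drives it from a looping call with its own interval handling):

import time

def wait_for_task(poll, interval=0.1):
    """poll() -> (state, progress); block until success or error."""
    while True:
        state, progress = poll()
        print(f"progress is {progress}%.")
        if state == "success":
            return
        if state == "error":
            raise RuntimeError("task failed")
        time.sleep(interval)

states = iter([("running", 0), ("running", 50), ("success", 100)])
wait_for_task(lambda: next(states))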
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 620.928208] env[66583]: DEBUG oslo_concurrency.lockutils [req-e4c6c5be-8f91-4608-a693-dddab55563de req-5e294e8e-e61f-4817-9613-a1c20286cee9 service nova] Releasing lock "refresh_cache-da4dff27-123e-44ac-83b5-1b2b3d731e0a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 621.432224] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 621.432486] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 621.433085] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 622.082812] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.082812] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.098185] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 622.162747] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 622.165416] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.003s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.167547] env[66583]: INFO nova.compute.claims [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 622.421599] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bb5b933-401b-48af-a90a-363a8b861de8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.431078] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4f6e0fa-cddf-4e25-ba19-be4d976065f1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.464481] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f753b7f-5cc3-4608-b2b2-b6c635a217fd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.472778] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a6b5b49-df16-42c8-a7d1-a4fcb25bd621 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.496039] env[66583]: DEBUG nova.compute.provider_tree [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 622.506415] env[66583]: DEBUG nova.scheduler.client.report [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 622.531881] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b 
tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.366s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.536558] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 622.583864] env[66583]: DEBUG nova.compute.utils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 622.585220] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 622.585784] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 622.600600] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 622.688139] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Start spawning the instance on the hypervisor. 
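"Using /dev/sd instead of None" in the block-device step above is the device-name defaulting path: with no prefix supplied, the next free name is generated from /dev/sd. A rough stdlib rendering of that idea, not nova's get_next_device_name itself:

import string

def next_device_name(used, prefix="/dev/sd"):
    for letter in string.ascii_lowercase:
        candidate = prefix + letter
        if candidate not in used:
            return candidate
    raise ValueError("no free device names")

print(next_device_name(set()))         # /dev/sda
print(next_device_name({"/dev/sda"}))  # /dev/sdb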
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 622.724301] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 622.724301] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 622.724301] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 622.724508] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 622.724508] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 622.724508] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 622.724508] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 622.724508] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 622.728482] env[66583]: DEBUG nova.virt.hardware [None 
req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 622.728482] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 622.728482] env[66583]: DEBUG nova.virt.hardware [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 622.728482] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b3d0430-b473-4d8f-9d80-5477077addc1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 622.738370] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3ae4e54-b883-4460-80b7-b4d7fc55179c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 623.080416] env[66583]: DEBUG nova.compute.manager [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Received event network-vif-plugged-556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 623.080667] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Acquiring lock "3816b87a-030d-4362-9596-bd0899455e52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 623.080905] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Lock "3816b87a-030d-4362-9596-bd0899455e52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 623.081086] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Lock "3816b87a-030d-4362-9596-bd0899455e52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 623.081306] env[66583]: DEBUG nova.compute.manager [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] No waiting events found dispatching network-vif-plugged-556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 623.081411] env[66583]: WARNING nova.compute.manager [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service 
nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Received unexpected event network-vif-plugged-556170a7-440a-4392-bd7d-31f611c5d0e4 for instance with vm_state building and task_state spawning. [ 623.082134] env[66583]: DEBUG nova.compute.manager [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Received event network-changed-556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 623.082134] env[66583]: DEBUG nova.compute.manager [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Refreshing instance network info cache due to event network-changed-556170a7-440a-4392-bd7d-31f611c5d0e4. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 623.082134] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Acquiring lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 623.082134] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Acquired lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 623.083499] env[66583]: DEBUG nova.network.neutron [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Refreshing network info cache for port 556170a7-440a-4392-bd7d-31f611c5d0e4 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 623.102835] env[66583]: DEBUG nova.policy [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3905566ae8314d40a601efb54a37ad26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '820b7d141569446ead1901b8442f8184', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 624.816904] env[66583]: DEBUG nova.network.neutron [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Updated VIF entry in instance network info cache for port 556170a7-440a-4392-bd7d-31f611c5d0e4. 
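The WARNING above fires because nothing was registered as waiting for this event: pop_instance_event finds no waiter for network-vif-plugged, so the manager reports it as unexpected for an instance still building/spawning. A simplified stdlib model of that rendezvous; names are illustrative:

import threading

waiters = {}  # (instance_uuid, event_name) -> threading.Event

def pop_and_dispatch(instance_uuid, event_name):
    waiter = waiters.pop((instance_uuid, event_name), None)
    if waiter is None:
        print(f"WARNING: Received unexpected event {event_name} for "
              f"instance {instance_uuid}")
        return
    waiter.set()  # unblocks the thread waiting on this instance event

pop_and_dispatch("3816b87a-030d-4362-9596-bd0899455e52",
                 "network-vif-plugged-556170a7-440a-4392-bd7d-31f611c5d0e4")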
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 624.817316] env[66583]: DEBUG nova.network.neutron [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Updating instance_info_cache with network_info: [{"id": "556170a7-440a-4392-bd7d-31f611c5d0e4", "address": "fa:16:3e:0e:74:4a", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap556170a7-44", "ovs_interfaceid": "556170a7-440a-4392-bd7d-31f611c5d0e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 624.827615] env[66583]: DEBUG oslo_concurrency.lockutils [req-0a6c1cdf-a3b2-4b1e-a0d7-476fafce42dd req-73d48d7f-46cc-444c-a489-d802cdb94c57 service nova] Releasing lock "refresh_cache-3816b87a-030d-4362-9596-bd0899455e52" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 625.153719] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "89e32d26-aa13-4b13-9aec-9e35513946e8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.154282] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.177099] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Starting instance... 
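The network_info blob cached above is a list of VIF dicts, each nesting a network with subnets and IPs. Tooling that reads the cache often needs just the fixed addresses; a small walker over that exact structure, trimmed to the keys it touches:

def fixed_ips(network_info):
    for vif in network_info:
        for subnet in vif["network"]["subnets"]:
            for ip in subnet["ips"]:
                if ip["type"] == "fixed":
                    yield vif["id"], ip["address"]

nw_info = [{"id": "556170a7-440a-4392-bd7d-31f611c5d0e4",
            "network": {"subnets": [{"ips": [{"address": "192.168.233.7",
                                              "type": "fixed"}]}]}}]
print(list(fixed_ips(nw_info)))  # [('556170a7-...', '192.168.233.7')]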
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 625.185989] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Successfully created port: 458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 625.244673] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 625.244923] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 625.250018] env[66583]: INFO nova.compute.claims [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 625.511018] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d588fee9-dab0-4538-b3e3-27e8dcd61774 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.524871] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9eb6dd2-2ffa-44d5-beb7-0e36d5de7004 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.565059] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bd0999b-47ae-49b9-9fa4-64547ec2b3c6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.579700] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d73b13a-738d-47f2-a4f3-13e2d9248c02 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.597291] env[66583]: DEBUG nova.compute.provider_tree [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 625.612110] env[66583]: DEBUG nova.scheduler.client.report [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 
1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 625.634042] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.387s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 625.634042] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 625.679367] env[66583]: DEBUG nova.compute.utils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 625.686474] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 625.686474] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 625.699281] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 625.802458] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 625.850978] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 625.852598] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 625.852775] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 625.852979] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 625.853137] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 625.853305] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 625.854387] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 625.854567] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 625.854741] 
env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 625.854907] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 625.855093] env[66583]: DEBUG nova.virt.hardware [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 625.856911] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da04543c-d0d5-48a8-a4e6-86796d68a6b2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 625.875092] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d9e46ab-2250-4bae-95b1-9cb17a046567 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 626.079952] env[66583]: DEBUG nova.policy [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0948a9987ddf418a85a3a36c400c1c50', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'de382616205d42ecbac319f959682cae', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 626.741846] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.742165] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.753507] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 626.822567] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 626.822567] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 626.824036] env[66583]: INFO nova.compute.claims [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 627.036600] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b7043e7-b3f6-4f8c-9cf0-161bbbbe2df7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.045696] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-872b1bb0-2de3-441c-82d7-45be4e9cc393 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.084180] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71413245-e2f7-4bf6-ad62-0f668c892c69 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.092292] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-814fddfa-92a4-4a3a-b92f-ccac0878ad0d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.106793] env[66583]: DEBUG nova.compute.provider_tree [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 627.115925] env[66583]: DEBUG nova.scheduler.client.report [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 627.137681] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 
tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.315s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 627.138338] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 627.178332] env[66583]: DEBUG nova.compute.utils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 627.182289] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 627.182471] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 627.197672] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 627.275731] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 627.307774] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 627.307774] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 627.307774] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 627.308075] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 627.308075] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 627.308075] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 627.308075] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 627.308470] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 627.308598] env[66583]: DEBUG 
nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 627.308981] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 627.309638] env[66583]: DEBUG nova.virt.hardware [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 627.310565] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67a8495d-b181-47b7-87f0-b13caf8f557e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.319448] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc891f11-777e-4fe8-ba06-180344946627 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 627.635871] env[66583]: DEBUG nova.policy [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54ffaf6f2a6441aba0aa66ec482633d4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '444481e013d84d97a9bd3a86514b4c5c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 628.721403] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Successfully created port: 2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 629.942281] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Successfully updated port: 458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 629.960526] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 629.960526] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b 
tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 629.961272] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 630.032057] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Successfully created port: 81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 630.123161] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.161394] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Updating instance_info_cache with network_info: [{"id": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "address": "fa:16:3e:55:5e:eb", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap458d9354-fe", "ovs_interfaceid": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 631.182626] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 631.183090] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] 
[instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance network_info: |[{"id": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "address": "fa:16:3e:55:5e:eb", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap458d9354-fe", "ovs_interfaceid": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 631.185016] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:55:5e:eb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae4e3171-21cd-4094-b6cf-81bf366c75bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '458d9354-fee6-403f-b8dd-ddd98a5fd11a', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 631.193921] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating folder: Project (820b7d141569446ead1901b8442f8184). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.195318] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-93f3b730-d047-4d73-bdbb-aa991254ef01 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.207682] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created folder: Project (820b7d141569446ead1901b8442f8184) in parent group-v693485. [ 631.207896] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating folder: Instances. Parent ref: group-v693504. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 631.208431] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-129511de-7ce5-4e9f-b374-30b249372839 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.218471] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created folder: Instances in parent group-v693504. [ 631.218946] env[66583]: DEBUG oslo.service.loopingcall [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 631.219345] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 631.219486] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6f2ae22a-7d6d-40e7-9a29-74aee5a378c6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.242791] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 631.242791] env[66583]: value = "task-3470236" [ 631.242791] env[66583]: _type = "Task" [ 631.242791] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.253044] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470236, 'name': CreateVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.538224] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Successfully updated port: 2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 631.546216] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.546357] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquired lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.546499] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 631.632680] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 631.757758] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470236, 'name': CreateVM_Task, 'duration_secs': 0.276592} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 631.757935] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 631.758708] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 631.758895] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 631.759219] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 631.759484] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3e0c55ce-42be-48fc-b379-cb1023209dfc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 631.766655] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 631.766655] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]521685d1-81f9-25ea-6994-a0eed43efe87" [ 631.766655] env[66583]: _type = "Task" [ 631.766655] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 631.776793] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]521685d1-81f9-25ea-6994-a0eed43efe87, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 631.899814] env[66583]: DEBUG nova.compute.manager [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Received event network-vif-plugged-458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 631.900300] env[66583]: DEBUG oslo_concurrency.lockutils [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] Acquiring lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 631.900503] env[66583]: DEBUG oslo_concurrency.lockutils [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 631.900649] env[66583]: DEBUG oslo_concurrency.lockutils [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 631.900809] env[66583]: DEBUG nova.compute.manager [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] No waiting events found dispatching network-vif-plugged-458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 631.900988] env[66583]: WARNING nova.compute.manager [req-bdb114f6-d5a5-4038-bf06-9a9b9beab3ec req-21512533-aac8-4082-a95e-af2d315125e9 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Received unexpected event network-vif-plugged-458d9354-fee6-403f-b8dd-ddd98a5fd11a for instance with vm_state building and task_state spawning. 
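The records above show Nova's external-event plumbing end to end: Neutron reports network-vif-plugged-458d9354-... for instance fce1b601-..., the compute manager takes the per-instance "fce1b601-...-events" lock, tries to pop a registered waiter for that event, finds none ("No waiting events found dispatching ..."), and logs the WARNING about an unexpected event because the port became active before the spawn path registered a waiter. Below is a minimal, hypothetical sketch of that register/pop pattern using only stdlib threading. The names InstanceEvents, prepare_for_instance_event, and pop_instance_event mirror the code paths named in the log (nova/compute/manager.py), but the bodies here are illustrative assumptions, not Nova's implementation.

import threading

class InstanceEvents:
    """Sketch of per-instance event bookkeeping (not Nova's actual code)."""

    def __init__(self):
        self._lock = threading.Lock()   # stands in for the "<uuid>-events" lock above
        self._events = {}               # {instance_uuid: {event_name: threading.Event}}

    def prepare_for_instance_event(self, instance_uuid, event_name):
        # The spawn path registers a waiter before plugging the VIF.
        with self._lock:
            waiter = threading.Event()
            self._events.setdefault(instance_uuid, {})[event_name] = waiter
            return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        # "Acquiring lock ... by ..._pop_event" / "released ... held 0.000s"
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(event_name, None)

def external_instance_event(events, instance_uuid, event_name):
    # Called when Neutron's event arrives at the compute manager.
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Port went active before anyone registered: the WARNING seen above
        # for an instance still in vm_state building / task_state spawning.
        print(f"WARNING: unexpected event {event_name} for {instance_uuid}")
    else:
        waiter.set()                    # wakes the spawn path blocked on this event

events = InstanceEvents()
external_instance_event(events, "fce1b601-0363-4447-b802-3ea5d3aa97a0",
                        "network-vif-plugged-458d9354-fee6-403f-b8dd-ddd98a5fd11a")

Run as-is, this prints the "unexpected event" warning, reproducing the race recorded above; had prepare_for_instance_event been called first, the event would simply have woken the waiter and no warning would appear.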
[ 632.280112] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.280737] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 632.281091] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.404052] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Updating instance_info_cache with network_info: [{"id": "2bea3604-3574-4103-a66d-e2617545e10c", "address": "fa:16:3e:51:ea:71", "network": {"id": "a179ca9c-73e5-4073-bb82-edca4dc17353", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1985690922-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de382616205d42ecbac319f959682cae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "309d7cfa-b4da-4eec-9f4b-2e10d215fac7", "external-id": "nsx-vlan-transportzone-285", "segmentation_id": 285, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2bea3604-35", "ovs_interfaceid": "2bea3604-3574-4103-a66d-e2617545e10c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 632.423742] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Releasing lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 632.424076] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance network_info: |[{"id": "2bea3604-3574-4103-a66d-e2617545e10c", "address": "fa:16:3e:51:ea:71", "network": {"id": 
"a179ca9c-73e5-4073-bb82-edca4dc17353", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1985690922-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de382616205d42ecbac319f959682cae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "309d7cfa-b4da-4eec-9f4b-2e10d215fac7", "external-id": "nsx-vlan-transportzone-285", "segmentation_id": 285, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2bea3604-35", "ovs_interfaceid": "2bea3604-3574-4103-a66d-e2617545e10c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 632.424452] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:51:ea:71', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '309d7cfa-b4da-4eec-9f4b-2e10d215fac7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2bea3604-3574-4103-a66d-e2617545e10c', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 632.437561] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Creating folder: Project (de382616205d42ecbac319f959682cae). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.438308] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c9bb3cc9-7be1-44a1-8773-009d3f8d5a77 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.450693] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Created folder: Project (de382616205d42ecbac319f959682cae) in parent group-v693485. [ 632.450885] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Creating folder: Instances. Parent ref: group-v693507. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 632.451170] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b6c2d5e-a4fe-41de-9076-f9da52eb8e9a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.461464] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Created folder: Instances in parent group-v693507. [ 632.463997] env[66583]: DEBUG oslo.service.loopingcall [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 632.464666] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 632.464666] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-94bfdaff-00db-4ef1-8f5a-c85bc50f004e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 632.496594] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 632.496594] env[66583]: value = "task-3470239" [ 632.496594] env[66583]: _type = "Task" [ 632.496594] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 632.505643] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470239, 'name': CreateVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 632.635521] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Successfully updated port: 81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 632.651687] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 632.651894] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquired lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 632.651987] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 632.746089] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 633.009784] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470239, 'name': CreateVM_Task, 'duration_secs': 0.287068} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 633.009965] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 633.011264] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.011436] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 633.011744] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 633.012016] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c6c86bd6-4f10-4242-9bd8-67d2b0ea2a76 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.017160] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for the task: (returnval){ [ 633.017160] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5272a37a-c707-d202-ae8d-9ced22011805" [ 633.017160] env[66583]: _type = "Task" [ 633.017160] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.027256] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5272a37a-c707-d202-ae8d-9ced22011805, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.191153] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Updating instance_info_cache with network_info: [{"id": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "address": "fa:16:3e:de:57:51", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81de94c4-0e", "ovs_interfaceid": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 633.206681] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Releasing lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.207063] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance network_info: |[{"id": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "address": "fa:16:3e:de:57:51", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81de94c4-0e", "ovs_interfaceid": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 633.207393] env[66583]: DEBUG nova.virt.vmwareapi.vmops 
[None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:57:51', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '81de94c4-0e5d-49ab-b531-8df7bdf14647', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 633.218638] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Creating folder: Project (444481e013d84d97a9bd3a86514b4c5c). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.220026] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c9972e32-b29e-4896-baec-0e1e631f1504 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.234974] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Created folder: Project (444481e013d84d97a9bd3a86514b4c5c) in parent group-v693485. [ 633.235180] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Creating folder: Instances. Parent ref: group-v693510. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 633.235664] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6cbde23a-a302-4781-bdd3-89e2f4ef99a0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.251571] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Created folder: Instances in parent group-v693510. [ 633.251823] env[66583]: DEBUG oslo.service.loopingcall [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 633.252362] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 633.252715] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1b56de5d-de62-430e-b125-0ab0e730a63b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 633.279533] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 633.279533] env[66583]: value = "task-3470242" [ 633.279533] env[66583]: _type = "Task" [ 633.279533] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 633.288369] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470242, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 633.528973] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 633.529248] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 633.529457] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 633.790388] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470242, 'name': CreateVM_Task} progress is 99%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 634.290852] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470242, 'name': CreateVM_Task} progress is 99%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 634.792218] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470242, 'name': CreateVM_Task, 'duration_secs': 1.262847} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 634.792792] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 634.796024] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 634.796024] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 634.796024] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 634.796024] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52213993-712a-413c-8ff1-71323230e23a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 634.803988] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Waiting for the task: (returnval){ [ 634.803988] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52991443-bbf1-f77e-7b20-59e3c2218386" [ 634.803988] env[66583]: _type = "Task" [ 634.803988] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 634.813115] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52991443-bbf1-f77e-7b20-59e3c2218386, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 635.318663] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 635.318926] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 635.319163] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.909640] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Received event network-changed-458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 635.909640] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Refreshing instance network info cache due to event network-changed-458d9354-fee6-403f-b8dd-ddd98a5fd11a. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 635.909775] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquiring lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 635.909911] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquired lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 635.911180] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Refreshing network info cache for port 458d9354-fee6-403f-b8dd-ddd98a5fd11a {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 636.354451] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Updated VIF entry in instance network info cache for port 458d9354-fee6-403f-b8dd-ddd98a5fd11a. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 636.354734] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Updating instance_info_cache with network_info: [{"id": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "address": "fa:16:3e:55:5e:eb", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap458d9354-fe", "ovs_interfaceid": "458d9354-fee6-403f-b8dd-ddd98a5fd11a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 636.367296] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Releasing lock "refresh_cache-fce1b601-0363-4447-b802-3ea5d3aa97a0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 636.367621] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Received event network-vif-plugged-2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 636.367824] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquiring lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 636.368550] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 636.368550] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 636.368550] env[66583]: DEBUG nova.compute.manager 
[req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] No waiting events found dispatching network-vif-plugged-2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 636.368550] env[66583]: WARNING nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Received unexpected event network-vif-plugged-2bea3604-3574-4103-a66d-e2617545e10c for instance with vm_state building and task_state spawning. [ 636.368771] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Received event network-changed-2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 636.368841] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Refreshing instance network info cache due to event network-changed-2bea3604-3574-4103-a66d-e2617545e10c. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 636.369788] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquiring lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 636.369950] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquired lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 636.370148] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Refreshing network info cache for port 2bea3604-3574-4103-a66d-e2617545e10c {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 637.019944] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Updated VIF entry in instance network info cache for port 2bea3604-3574-4103-a66d-e2617545e10c. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 637.020253] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Updating instance_info_cache with network_info: [{"id": "2bea3604-3574-4103-a66d-e2617545e10c", "address": "fa:16:3e:51:ea:71", "network": {"id": "a179ca9c-73e5-4073-bb82-edca4dc17353", "bridge": "br-int", "label": "tempest-ServersTestFqdnHostnames-1985690922-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "de382616205d42ecbac319f959682cae", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "309d7cfa-b4da-4eec-9f4b-2e10d215fac7", "external-id": "nsx-vlan-transportzone-285", "segmentation_id": 285, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2bea3604-35", "ovs_interfaceid": "2bea3604-3574-4103-a66d-e2617545e10c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.036108] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Releasing lock "refresh_cache-89e32d26-aa13-4b13-9aec-9e35513946e8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 637.036108] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Received event network-vif-plugged-81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 637.036108] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquiring lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 637.036108] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 637.036308] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 637.036308] env[66583]: DEBUG nova.compute.manager 
[req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] No waiting events found dispatching network-vif-plugged-81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 637.036308] env[66583]: WARNING nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Received unexpected event network-vif-plugged-81de94c4-0e5d-49ab-b531-8df7bdf14647 for instance with vm_state building and task_state spawning. [ 637.036396] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Received event network-changed-81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 637.036497] env[66583]: DEBUG nova.compute.manager [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Refreshing instance network info cache due to event network-changed-81de94c4-0e5d-49ab-b531-8df7bdf14647. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 637.036850] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquiring lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 637.036850] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Acquired lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 637.036950] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Refreshing network info cache for port 81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 637.476594] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Updated VIF entry in instance network info cache for port 81de94c4-0e5d-49ab-b531-8df7bdf14647. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 637.477281] env[66583]: DEBUG nova.network.neutron [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Updating instance_info_cache with network_info: [{"id": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "address": "fa:16:3e:de:57:51", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.69", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81de94c4-0e", "ovs_interfaceid": "81de94c4-0e5d-49ab-b531-8df7bdf14647", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 637.495480] env[66583]: DEBUG oslo_concurrency.lockutils [req-640b854c-385d-45b8-9743-4a4336e9b10f req-98c5e2f3-fc36-4aeb-a80e-23f0c7f81c76 service nova] Releasing lock "refresh_cache-8c6830c9-f8e4-4c72-892c-3012cd9b84c0" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 643.346125] env[66583]: DEBUG nova.compute.manager [req-25a8ba97-6639-49c6-a3a6-1a67fda0db9e req-dca514d1-1e4b-4788-be21-d28aac8eb92a service nova] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Received event network-vif-deleted-81de94c4-0e5d-49ab-b531-8df7bdf14647 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 657.127342] env[66583]: WARNING oslo_vmware.rw_handles [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" 
[ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 657.127342] env[66583]: ERROR oslo_vmware.rw_handles [ 657.128029] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 657.129454] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 657.129518] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Copying Virtual Disk [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/ff40750c-dc41-46cd-8997-c6387018892f/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 657.129776] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-1c3d2460-11aa-448f-a556-4283b78e0b3f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.138220] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Waiting for the task: (returnval){ [ 657.138220] env[66583]: value = "task-3470243" [ 657.138220] env[66583]: _type = "Task" [ 657.138220] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 657.146868] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Task: {'id': task-3470243, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 657.650152] env[66583]: DEBUG oslo_vmware.exceptions [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Fault InvalidArgument not matched. 
{{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 657.650497] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 657.654140] env[66583]: ERROR nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.654140] env[66583]: Faults: ['InvalidArgument'] [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Traceback (most recent call last): [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] yield resources [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self.driver.spawn(context, instance, image_meta, [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self._fetch_image_if_missing(context, vi) [ 657.654140] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] image_cache(vi, tmp_image_ds_loc) [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] vm_util.copy_virtual_disk( [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] session._wait_for_task(vmdk_copy_task) [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return self.wait_for_task(task_ref) [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return evt.wait() [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] result = hub.switch() [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 657.654566] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return self.greenlet.switch() [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self.f(*self.args, **self.kw) [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] raise exceptions.translate_fault(task_info.error) [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Faults: ['InvalidArgument'] [ 657.654921] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] [ 657.654921] env[66583]: INFO nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Terminating instance [ 657.656241] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 657.656436] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 657.657102] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Start destroying the instance 
on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 657.657289] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 657.657510] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ab464d8-c5b1-4d3c-b235-5d5e93613a80 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.660108] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3826a165-4931-46c7-9f46-f5c73f565d13 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.669227] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 657.670499] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e4991e3e-69c1-4d4d-8280-59e09e5bd569 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.672062] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 657.672258] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 657.672972] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-08e60317-c143-455b-9e29-2ac34886b746 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.679158] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 657.679158] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]522162f7-3b84-d622-7431-ee34e6811737" [ 657.679158] env[66583]: _type = "Task" [ 657.679158] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 657.686150] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]522162f7-3b84-d622-7431-ee34e6811737, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 657.745067] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 657.745307] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 657.745495] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Deleting the datastore file [datastore2] 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 657.745758] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ae896fa8-9d28-462b-92cc-0c7dec8e5698 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.752675] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Waiting for the task: (returnval){ [ 657.752675] env[66583]: value = "task-3470245" [ 657.752675] env[66583]: _type = "Task" [ 657.752675] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 657.762486] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Task: {'id': task-3470245, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 658.190119] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 658.190398] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating directory with path [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 658.190631] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e01b5ccb-a803-46f3-8647-7a8152836cc9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.202481] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created directory with path [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 658.202689] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Fetch image to [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 658.202862] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 658.204468] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07817f7a-6239-4818-a110-687f6694f289 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.225338] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d09201b5-c2e8-4c1d-9074-2b33ee18a1c7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.232869] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8681e910-0b90-4e7f-a2c8-3d36192774dc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.272100] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cdc6a74-4471-415e-82e3-302be48afae8 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.281754] env[66583]: DEBUG oslo_vmware.api [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Task: {'id': task-3470245, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077548} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 658.281921] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 658.282280] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 658.282464] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 658.286064] env[66583]: INFO nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 658.286064] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-87581e6a-145c-4da5-a746-bfe2ef6e095a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.287436] env[66583]: DEBUG nova.compute.claims [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 658.287563] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 658.287734] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 658.370132] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 658.444195] env[66583]: DEBUG oslo_vmware.rw_handles [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 658.513336] env[66583]: DEBUG oslo_vmware.rw_handles [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 658.513566] env[66583]: DEBUG oslo_vmware.rw_handles [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 658.543146] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe5edf06-6278-44d3-a90a-318654c73955 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.552182] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1810002-3874-4602-a0dc-0f0dc6710694 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.589754] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fe5970c-079e-4b69-a09d-3241dd57d737 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.595608] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ccf7bdd-adad-4740-8952-119cf15fcbfe {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.610883] env[66583]: DEBUG nova.compute.provider_tree [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 658.625558] env[66583]: DEBUG nova.scheduler.client.report [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 658.641651] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.354s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 658.642224] env[66583]: ERROR nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 658.642224] env[66583]: Faults: ['InvalidArgument'] [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Traceback (most recent call last): [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 
5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self.driver.spawn(context, instance, image_meta, [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self._fetch_image_if_missing(context, vi) [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] image_cache(vi, tmp_image_ds_loc) [ 658.642224] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] vm_util.copy_virtual_disk( [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] session._wait_for_task(vmdk_copy_task) [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return self.wait_for_task(task_ref) [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return evt.wait() [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] result = hub.switch() [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] return self.greenlet.switch() [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 658.642964] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] self.f(*self.args, **self.kw) [ 658.643506] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 658.643506] env[66583]: ERROR 
nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] raise exceptions.translate_fault(task_info.error) [ 658.643506] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 658.643506] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Faults: ['InvalidArgument'] [ 658.643506] env[66583]: ERROR nova.compute.manager [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] [ 658.643506] env[66583]: DEBUG nova.compute.utils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 658.647597] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Build of instance 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e was re-scheduled: A specified parameter was not correct: fileType [ 658.647597] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 658.649022] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 658.649022] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 658.649022] env[66583]: DEBUG nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 658.649022] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 660.519140] env[66583]: DEBUG nova.network.neutron [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 660.537123] env[66583]: INFO nova.compute.manager [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] [instance: 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e] Took 1.89 seconds to deallocate network for instance. [ 660.661217] env[66583]: INFO nova.scheduler.client.report [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Deleted allocations for instance 5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e [ 660.684915] env[66583]: DEBUG oslo_concurrency.lockutils [None req-52395fe5-2073-4311-8c39-57763c94edf3 tempest-ServerExternalEventsTest-1404717804 tempest-ServerExternalEventsTest-1404717804-project-member] Lock "5e2ed48b-9cc5-4f62-b5a7-9f161b5a6a4e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 61.094s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 679.299879] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.330990] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.846575] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.846902] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.848132] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None 
None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.848132] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 679.848132] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 679.863687] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.863917] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.864097] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 679.864252] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 679.866968] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20aec7b7-8329-407a-8412-1d0ade4b5942 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.878464] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30a21683-ba93-43a9-b540-13fcc3088c34 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.894784] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30d8e0b-213b-4281-8b04-85943e2922e9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.902124] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ab8eaf5-df6c-4d67-99ab-49fa0a5547b7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.939424] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180943MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 679.939580] env[66583]: DEBUG oslo_concurrency.lockutils [None 
req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.940213] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.039498] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fac7d6a8-d74e-4130-8068-236289d5d616 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.039720] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance f5415bfe-3f3a-4f4b-985d-59655791bb2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.039774] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a0bd3693-ed3f-4573-8250-85ae19a08869 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.039886] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance da4dff27-123e-44ac-83b5-1b2b3d731e0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.040070] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 3816b87a-030d-4362-9596-bd0899455e52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.040126] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.040239] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
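
The seven identical per-instance allocations above line up with the totals the tracker reports next; a quick arithmetic check (the 512 MB 'reserved' from the MEMORY_MB inventory appears to be folded into used_ram, which is how the 1408 MB figure arises):

    # Cross-check of the resource tracker totals from the allocations above.
    allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 7
    reserved_ram_mb = 512  # 'reserved' in the MEMORY_MB inventory

    used_ram = reserved_ram_mb + sum(a['MEMORY_MB'] for a in allocations)
    used_disk = sum(a['DISK_GB'] for a in allocations)
    used_vcpus = sum(a['VCPU'] for a in allocations)

    # Matches "used_ram=1408MB ... used_disk=7GB ... used_vcpus=7" below.
    assert (used_ram, used_disk, used_vcpus) == (1408, 7, 7)
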
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 680.040490] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 680.040577] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 680.165386] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a37ff14-dc44-46f0-8b4f-ce49007463be {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.176195] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ab0b0af-ff7d-43c1-b4b6-f7e91e08ebe7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.218180] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10512dcf-ab2f-4884-bee2-5c14d8e8637a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.226527] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-238604c8-f79d-4dde-a049-7a51db533153 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.243406] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 680.251980] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 680.271870] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 680.271870] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.332s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.267015] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 681.267589] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 681.267589] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 681.267589] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 681.284369] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285154] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285154] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285154] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285154] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285154] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285478] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 681.285478] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
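
All seven instances are still in the Building state, so the info-cache healer skips each of them and ends with nothing to refresh. A sketch of that selection, using plain dicts rather than Nova's instance objects:

    # Sketch of the _heal_instance_info_cache filter: building instances
    # have no stable network info yet, so they are skipped.
    BUILDING = 'building'

    def instances_to_heal(instances):
        return [inst for inst in instances
                if inst['vm_state'] != BUILDING]

    # Every instance on this host is building, hence "Didn't find any
    # instances for network info cache update."
    assert instances_to_heal([{'uuid': 'fac7d6a8', 'vm_state': BUILDING}]) == []
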
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 681.285769] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 681.285865] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 688.986017] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 688.986506] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.000925] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Starting instance... 
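
The build that starts here serializes on a lock named after the instance UUID, which is why a competing request for "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" would report a non-zero "waited" time. The same pattern in isolation, using the real oslo.concurrency API (the wrapper function is ours):

    # Per-instance serialization as seen in the lockutils lines above.
    from oslo_concurrency import lockutils

    def locked_do_build_and_run_instance(instance_uuid, build):
        # Produces the same "Acquiring lock ... acquired ... released"
        # trio of debug lines when lock logging is enabled (the default).
        with lockutils.lock(instance_uuid):
            return build()
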
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 689.062709] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.063012] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.064515] env[66583]: INFO nova.compute.claims [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 689.281338] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00813736-6dc4-493d-87d8-549e095bf3d7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.293131] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-572cbe9f-714f-4ca5-99d8-f5a2b860513b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.332325] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340e1503-7f1d-41c1-b524-2106c6c6dba6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.340755] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526f6d82-237e-483e-a251-d5ee04ba9143 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.354428] env[66583]: DEBUG nova.compute.provider_tree [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 689.366467] env[66583]: DEBUG nova.scheduler.client.report [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 689.381222] 
env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.318s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 689.381758] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 689.422561] env[66583]: DEBUG nova.compute.utils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 689.424872] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 689.426320] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 689.441139] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 689.530908] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Start spawning the instance on the hypervisor. 
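
The "Using /dev/sd instead of None" line just above is the device-name fallback: the block device mapping carries no device name, so the default /dev/sd prefix is used and the next free name is chosen. A simplified sketch (single-letter suffixes only; Nova's real get_next_device_name handles more cases):

    # Simplified device-name fallback, assuming single-letter suffixes.
    import string

    def next_device_name(existing, prefix='/dev/sd'):
        used = {n[len(prefix):] for n in existing if n.startswith(prefix)}
        for letter in string.ascii_lowercase:
            if letter not in used:
                return prefix + letter
        raise ValueError('no free device names')

    assert next_device_name([]) == '/dev/sda'
    assert next_device_name(['/dev/sda']) == '/dev/sdb'
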
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 689.562086] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 689.562387] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 689.562616] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 689.562723] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 689.562902] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 689.564016] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 689.565705] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 689.566807] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 
tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 689.566807] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 689.566807] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 689.566807] env[66583]: DEBUG nova.virt.hardware [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 689.567604] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7979387e-6fd0-4712-9a2d-aaa1a18b7cfa {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.576045] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fd74c35-ce3f-4c0c-9651-27a0ce4ec670 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.700844] env[66583]: DEBUG nova.policy [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b3ef220b045d4dde98a3d54369c46e2c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a067f662226f4250a146d993b8732e79', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 689.743809] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "a14dfb60-e62a-4a74-9f5b-f031814c609e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.744150] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "a14dfb60-e62a-4a74-9f5b-f031814c609e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.756440] env[66583]: DEBUG 
nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 689.818994] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 689.820029] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 689.821459] env[66583]: INFO nova.compute.claims [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 690.047997] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8ca6d75-7499-49cf-83da-00037637478c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.058755] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cca925f-8f7f-499a-8f6c-7dea95f26374 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.096581] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2883c44e-66e3-4dfc-bb79-5f3901b46232 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.104290] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10db3572-dabd-46ec-927c-6f798aeda38c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.121941] env[66583]: DEBUG nova.compute.provider_tree [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 690.132125] env[66583]: DEBUG nova.scheduler.client.report [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 
'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 690.148123] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.329s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 690.148809] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 690.196659] env[66583]: DEBUG nova.compute.utils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 690.198087] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 690.199057] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 690.209130] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 690.284383] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Start spawning the instance on the hypervisor. 
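
The hardware dump that follows repeats the computation already logged for the previous instance: with vcpus=1 and no flavor or image limits, the only factorisation is sockets=1, cores=1, threads=1. A brute-force sketch of the enumeration (not Nova's actual search order, but it reproduces the "Got 1 possible topologies" result):

    # Enumerate (sockets, cores, threads) factorisations of vcpus within
    # the 65536-per-dimension limits shown in the log.
    def possible_topologies(vcpus, limit=65536):
        return [(s, c, t)
                for s in range(1, min(vcpus, limit) + 1)
                for c in range(1, min(vcpus, limit) + 1)
                for t in range(1, min(vcpus, limit) + 1)
                if s * c * t == vcpus]

    assert possible_topologies(1) == [(1, 1, 1)]  # m1.nano, vcpus=1
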
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 690.317108] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 690.317108] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 690.317108] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 690.317297] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 690.317297] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 690.317297] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 690.317297] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 690.317297] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 690.317756] env[66583]: DEBUG 
nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 690.317756] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 690.317756] env[66583]: DEBUG nova.virt.hardware [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 690.318031] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e636bcd-7bef-43ad-9822-046fb6562749 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.327064] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42184f1a-17ce-4f4b-9fdf-720fcb78134f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.601534] env[66583]: DEBUG nova.policy [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '84234db699c24121a8fe158ab52afa0e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6f5002b79bc247d8ba9a54f5bdeacfec', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 691.070213] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.071437] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.355959] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "12bc9e29-ecea-40e9-af34-a067f3d2301f" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.357050] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "12bc9e29-ecea-40e9-af34-a067f3d2301f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.815885] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Successfully created port: 0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 692.611197] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Successfully created port: 0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 695.135178] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "6deed686-ceca-45a1-b8e4-2461b2e3f039" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.135178] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "6deed686-ceca-45a1-b8e4-2461b2e3f039" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.521421] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Successfully updated port: 0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 695.536099] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 695.536225] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquired lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 695.536373] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 695.655989] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 696.607618] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Updating instance_info_cache with network_info: [{"id": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "address": "fa:16:3e:23:28:ac", "network": {"id": "4fabb88d-b4d0-4f5e-b8d0-85f319db1cd5", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1392379790-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a067f662226f4250a146d993b8732e79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fc8bf3d-1b", "ovs_interfaceid": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 696.621858] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Releasing lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 696.622196] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Instance network_info: |[{"id": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "address": "fa:16:3e:23:28:ac", "network": {"id": "4fabb88d-b4d0-4f5e-b8d0-85f319db1cd5", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1392379790-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a067f662226f4250a146d993b8732e79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fc8bf3d-1b", "ovs_interfaceid": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 696.622580] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:23:28:ac', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '49b5df12-d801-4140-8816-2fd401608c7d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0fc8bf3d-1bab-4f7d-8a32-d78076f86912', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 696.631540] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Creating folder: Project (a067f662226f4250a146d993b8732e79). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.631540] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1906531f-2338-4a0b-8412-258a8640b8b1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.641587] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Created folder: Project (a067f662226f4250a146d993b8732e79) in parent group-v693485. [ 696.641796] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Creating folder: Instances. Parent ref: group-v693520. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 696.642036] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-415b26db-047b-4636-a668-bb04b148d3bd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.655121] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Created folder: Instances in parent group-v693520. 
[ 696.655121] env[66583]: DEBUG oslo.service.loopingcall [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 696.655121] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 696.655121] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a935b7f5-9181-4af0-b2c0-eb92a9006129 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 696.668983] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Successfully updated port: 0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 696.675406] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 696.675406] env[66583]:     value = "task-3470262"
[ 696.675406] env[66583]:     _type = "Task"
[ 696.675406] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 696.684559] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470262, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 696.686320] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 696.686320] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquired lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 696.686320] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 696.805189] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Instance cache missing network info.
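
The "Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return" line above is oslo.service's looping-call machinery: CreateVM_Task is submitted once, then polled at a fixed interval until it completes. The same pattern in isolation, with a stubbed poll function in place of the vCenter query:

    # The polling pattern behind the loopingcall lines above: raising
    # LoopingCallDone stops the loop and wakes the waiter with a value.
    from oslo_service import loopingcall

    progress = iter([0, 50, 100])

    def poll():
        pct = next(progress)  # stand-in for querying task progress
        if pct == 100:
            raise loopingcall.LoopingCallDone(retvalue='task-3470262')

    timer = loopingcall.FixedIntervalLoopingCall(poll)
    result = timer.start(interval=0.01).wait()  # blocks until done
    assert result == 'task-3470262'
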
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 696.933790] env[66583]: DEBUG nova.compute.manager [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Received event network-vif-plugged-0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 696.934125] env[66583]: DEBUG oslo_concurrency.lockutils [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] Acquiring lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.934246] env[66583]: DEBUG oslo_concurrency.lockutils [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] Lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.934405] env[66583]: DEBUG oslo_concurrency.lockutils [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] Lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 696.934566] env[66583]: DEBUG nova.compute.manager [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] No waiting events found dispatching network-vif-plugged-0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 696.934721] env[66583]: WARNING nova.compute.manager [req-595cb18f-f69e-4f0a-b6da-953b829a22a7 req-a5291507-ee95-454b-bd0d-4ae9624d73c3 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Received unexpected event network-vif-plugged-0fc8bf3d-1bab-4f7d-8a32-d78076f86912 for instance with vm_state building and task_state spawning. [ 697.185862] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470262, 'name': CreateVM_Task, 'duration_secs': 0.292226} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 697.190772] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 697.192541] env[66583]: DEBUG oslo_vmware.service [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a498145a-3db3-48d2-a8c3-47148de0653d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.201244] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 697.201409] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 697.202120] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 697.202978] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c015f56-4880-4501-a31b-7ac290ccbfe2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.207601] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Waiting for the task: (returnval){ [ 697.207601] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]527d228b-8414-7c52-1c17-8faa3fc0d0f6" [ 697.207601] env[66583]: _type = "Task" [ 697.207601] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 697.217499] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]527d228b-8414-7c52-1c17-8faa3fc0d0f6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 697.364401] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Updating instance_info_cache with network_info: [{"id": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "address": "fa:16:3e:53:64:68", "network": {"id": "100e8d9d-d3ca-419b-9d79-3cb940cd4a67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1959654231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6f5002b79bc247d8ba9a54f5bdeacfec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b83383f-ed7a-4efd-aef7-aa8c15649d07", "external-id": "nsx-vlan-transportzone-282", "segmentation_id": 282, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0da63175-e8", "ovs_interfaceid": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 697.379506] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Releasing lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 697.380817] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Instance network_info: |[{"id": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "address": "fa:16:3e:53:64:68", "network": {"id": "100e8d9d-d3ca-419b-9d79-3cb940cd4a67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1959654231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6f5002b79bc247d8ba9a54f5bdeacfec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b83383f-ed7a-4efd-aef7-aa8c15649d07", "external-id": "nsx-vlan-transportzone-282", "segmentation_id": 282, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0da63175-e8", "ovs_interfaceid": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 697.383014] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:53:64:68', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7b83383f-ed7a-4efd-aef7-aa8c15649d07', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0da63175-e8f6-4a88-89b8-e0a9f1de84e8', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 697.392368] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Creating folder: Project (6f5002b79bc247d8ba9a54f5bdeacfec). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 697.393041] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-059b18c6-5961-4e59-af9a-cc6e095dd086 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.408847] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Created folder: Project (6f5002b79bc247d8ba9a54f5bdeacfec) in parent group-v693485. [ 697.408847] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Creating folder: Instances. Parent ref: group-v693523. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 697.408847] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f102f61a-71be-4605-8eda-129f34558085 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.420034] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Created folder: Instances in parent group-v693523. [ 697.420313] env[66583]: DEBUG oslo.service.loopingcall [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 697.420467] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 697.420640] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8cc38c42-8587-498f-9110-be03a5d14ac4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.441099] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 697.441099] env[66583]: value = "task-3470265" [ 697.441099] env[66583]: _type = "Task" [ 697.441099] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 697.449759] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470265, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 697.723751] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 697.724171] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 697.724294] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 697.724451] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 697.724631] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 697.725289] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-eb0ecf4d-0402-494d-937e-96fb445e1f7f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.746486] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 697.746665] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 697.747584] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f3a04f6-49e9-4e0a-866a-9f8a4316ce36 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.761520] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4847209d-0445-4f33-83aa-44e98507ef4c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 697.766017] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Waiting for the task: (returnval){ [ 697.766017] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52fa20ce-e1c8-3076-0a4f-f09146030b65" [ 697.766017] env[66583]: _type = "Task" [ 697.766017] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 697.777023] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52fa20ce-e1c8-3076-0a4f-f09146030b65, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 697.952831] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470265, 'name': CreateVM_Task} progress is 99%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 698.278956] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 698.281375] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Creating directory with path [datastore1] vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 698.281657] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0169802e-c57d-4b64-b93f-0f44b284010e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.319784] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Created directory with path [datastore1] vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 698.320060] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 
tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Fetch image to [datastore1] vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 698.320183] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 698.320991] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-502f3336-9e50-4b34-a927-af52a5800a4c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.332399] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bdd3ef2-0b09-4291-be7b-96372ca7eb30 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.353251] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32946127-6857-4ff0-b6b6-05f2dcc33bd0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.393046] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5428c494-94e5-4ca0-9fdd-423a4b3429f2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.400668] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-4c98238c-731b-4de6-92ad-7dbe499903bf {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.456801] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470265, 'name': CreateVM_Task} progress is 99%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 698.482017] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "9915557d-4251-44a2-bf59-3dd542dfb527" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.483428] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "9915557d-4251-44a2-bf59-3dd542dfb527" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.498214] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 698.543704] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 698.607980] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 698.608193] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 698.954355] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470265, 'name': CreateVM_Task, 'duration_secs': 1.305618} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 698.954768] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 698.955484] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 698.955797] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 698.955947] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 698.956207] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-51f3ab7c-f1e1-4699-a278-e43e523d6861 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 698.962251] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Waiting for the task: (returnval){ [ 698.962251] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52d33708-697a-e32a-6b7c-cdc5455ed207" [ 698.962251] env[66583]: _type = "Task" [ 698.962251] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 698.971214] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52d33708-697a-e32a-6b7c-cdc5455ed207, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 699.326913] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 699.327246] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 699.477361] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 699.477361] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 699.477361] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.500110] env[66583]: DEBUG nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Received event network-changed-0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.500868] env[66583]: DEBUG nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Refreshing instance network info cache due to event network-changed-0fc8bf3d-1bab-4f7d-8a32-d78076f86912. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 700.501184] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Acquiring lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.501416] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Acquired lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.501625] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Refreshing network info cache for port 0fc8bf3d-1bab-4f7d-8a32-d78076f86912 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 700.579770] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Acquiring lock "63244459-f37b-4fdb-8afc-9e4a80156099" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.580524] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "63244459-f37b-4fdb-8afc-9e4a80156099" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.902500] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Updated VIF entry in instance network info cache for port 0fc8bf3d-1bab-4f7d-8a32-d78076f86912. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 700.902876] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Updating instance_info_cache with network_info: [{"id": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "address": "fa:16:3e:23:28:ac", "network": {"id": "4fabb88d-b4d0-4f5e-b8d0-85f319db1cd5", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-1392379790-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a067f662226f4250a146d993b8732e79", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "49b5df12-d801-4140-8816-2fd401608c7d", "external-id": "nsx-vlan-transportzone-326", "segmentation_id": 326, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0fc8bf3d-1b", "ovs_interfaceid": "0fc8bf3d-1bab-4f7d-8a32-d78076f86912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 700.918539] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Releasing lock "refresh_cache-0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 700.918800] env[66583]: DEBUG nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Received event network-vif-plugged-0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.919098] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Acquiring lock "a14dfb60-e62a-4a74-9f5b-f031814c609e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 700.919204] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Lock "a14dfb60-e62a-4a74-9f5b-f031814c609e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 700.919521] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Lock "a14dfb60-e62a-4a74-9f5b-f031814c609e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 700.919521] env[66583]: DEBUG nova.compute.manager 
[req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] No waiting events found dispatching network-vif-plugged-0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 700.919673] env[66583]: WARNING nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Received unexpected event network-vif-plugged-0da63175-e8f6-4a88-89b8-e0a9f1de84e8 for instance with vm_state building and task_state spawning. [ 700.919831] env[66583]: DEBUG nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Received event network-changed-0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 700.919993] env[66583]: DEBUG nova.compute.manager [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Refreshing instance network info cache due to event network-changed-0da63175-e8f6-4a88-89b8-e0a9f1de84e8. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 700.920170] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Acquiring lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 700.920304] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Acquired lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 700.920603] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Refreshing network info cache for port 0da63175-e8f6-4a88-89b8-e0a9f1de84e8 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 701.469305] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Updated VIF entry in instance network info cache for port 0da63175-e8f6-4a88-89b8-e0a9f1de84e8. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 701.469688] env[66583]: DEBUG nova.network.neutron [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Updating instance_info_cache with network_info: [{"id": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "address": "fa:16:3e:53:64:68", "network": {"id": "100e8d9d-d3ca-419b-9d79-3cb940cd4a67", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1959654231-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "6f5002b79bc247d8ba9a54f5bdeacfec", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7b83383f-ed7a-4efd-aef7-aa8c15649d07", "external-id": "nsx-vlan-transportzone-282", "segmentation_id": 282, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0da63175-e8", "ovs_interfaceid": "0da63175-e8f6-4a88-89b8-e0a9f1de84e8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 701.494259] env[66583]: DEBUG oslo_concurrency.lockutils [req-a0c953f6-b27d-4017-a113-217153cf27ca req-85073692-0188-4e16-8198-80fb8af75b83 service nova] Releasing lock "refresh_cache-a14dfb60-e62a-4a74-9f5b-f031814c609e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 701.582201] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "a14582eb-f78f-44d6-8c82-16976c0cec5b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 701.582482] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock "a14582eb-f78f-44d6-8c82-16976c0cec5b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.157874] env[66583]: WARNING oslo_vmware.rw_handles [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 705.157874] env[66583]: ERROR oslo_vmware.rw_handles [ 705.157874] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 705.160527] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 705.160527] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Copying Virtual Disk [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/1f9d2735-6dff-4a19-8211-d6224007e1bd/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 705.161075] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6593e584-02cb-4322-a5a4-a2a6f475c615 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.173534] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 705.173534] env[66583]: value = "task-3470269" [ 705.173534] env[66583]: _type = "Task" [ 705.173534] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.184982] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470269, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.690575] env[66583]: DEBUG oslo_vmware.exceptions [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 705.690993] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 705.691476] env[66583]: ERROR nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.691476] env[66583]: Faults: ['InvalidArgument'] [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Traceback (most recent call last): [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] yield resources [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self.driver.spawn(context, instance, image_meta, [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self._vmops.spawn(context, instance, image_meta, injected_files, [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self._fetch_image_if_missing(context, vi) [ 705.691476] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] image_cache(vi, tmp_image_ds_loc) [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] vm_util.copy_virtual_disk( [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in 
copy_virtual_disk [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] session._wait_for_task(vmdk_copy_task) [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return self.wait_for_task(task_ref) [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return evt.wait() [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] result = hub.switch() [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 705.692016] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return self.greenlet.switch() [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self.f(*self.args, **self.kw) [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] raise exceptions.translate_fault(task_info.error) [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Faults: ['InvalidArgument'] [ 705.692671] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] [ 705.692671] env[66583]: INFO nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Terminating instance [ 705.693979] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 705.693979] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 705.693979] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-43f5a373-8703-4991-ba5c-cad653101d76 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.700749] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 705.700975] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 705.701821] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7468dd57-50c0-4bee-83c2-96e4dfaed666 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.716876] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 705.718902] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-236a0925-6dd6-4365-891e-0498010863d2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.720699] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 705.720955] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 705.722110] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-88779d20-36c3-4f6c-ad5d-c9bf8e04885b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.734342] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for the task: (returnval){ [ 705.734342] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52426787-2e80-06e0-56f7-6bd384327203" [ 705.734342] env[66583]: _type = "Task" [ 705.734342] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.747713] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52426787-2e80-06e0-56f7-6bd384327203, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 705.817996] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 705.817996] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 705.818181] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Deleting the datastore file [datastore2] fac7d6a8-d74e-4130-8068-236289d5d616 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 705.818749] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-79512b6c-9571-4884-aea4-e28567785f61 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 705.841914] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 705.841914] env[66583]: value = "task-3470272" [ 705.841914] env[66583]: _type = "Task" [ 705.841914] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 705.853416] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470272, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 706.247774] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 706.248097] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Creating directory with path [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 706.248383] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7413bdf6-1eec-4763-b878-07da91f92687 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.263713] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Created directory with path [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 706.264558] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Fetch image to [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 706.264825] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 706.266135] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a30c5219-95c4-4a32-aff5-1f38ecb32798 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.273920] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f167f1fe-1a34-4a06-85ea-45e823bbaf36 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.285273] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6af33bdd-68d0-47b9-803d-257e0611d99f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.323659] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a9cb4f6-c1ca-468f-88d3-bc435faecf02 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.332252] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6f000d57-f451-4b93-98c0-1ecc875f5302 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.351431] env[66583]: DEBUG oslo_vmware.api [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470272, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.084016} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 706.351608] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 706.351768] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 706.351943] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 706.352147] env[66583]: INFO nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Took 0.65 seconds to destroy the instance on the hypervisor. 
[ 706.354616] env[66583]: DEBUG nova.compute.claims [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 706.354756] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.354980] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.360182] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 706.426110] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 706.496021] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 706.496021] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 706.694916] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee43c37a-3098-4136-8e6f-ec59f9b0b9af {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.703160] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abe27276-67fd-408c-a337-b6a35b43fd06 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.738703] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d2a53d6-3663-4da9-8aaa-9eb21e0a8bc8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.747423] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c24e25-2775-4f0e-a180-1e6a6d2b1465 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.762606] env[66583]: DEBUG nova.compute.provider_tree [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 706.778177] env[66583]: DEBUG nova.scheduler.client.report [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 706.802260] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.445s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.802260] env[66583]: ERROR nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.802260] env[66583]: Faults: ['InvalidArgument'] [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Traceback (most recent call last): [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: 
fac7d6a8-d74e-4130-8068-236289d5d616] self.driver.spawn(context, instance, image_meta, [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self._vmops.spawn(context, instance, image_meta, injected_files, [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 706.802260] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self._fetch_image_if_missing(context, vi) [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] image_cache(vi, tmp_image_ds_loc) [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] vm_util.copy_virtual_disk( [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] session._wait_for_task(vmdk_copy_task) [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return self.wait_for_task(task_ref) [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return evt.wait() [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] result = hub.switch() [ 706.802502] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] return self.greenlet.switch() [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] self.f(*self.args, **self.kw) [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 706.802835] env[66583]: ERROR 
nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] raise exceptions.translate_fault(task_info.error) [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Faults: ['InvalidArgument'] [ 706.802835] env[66583]: ERROR nova.compute.manager [instance: fac7d6a8-d74e-4130-8068-236289d5d616] [ 706.802835] env[66583]: DEBUG nova.compute.utils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 706.803601] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Build of instance fac7d6a8-d74e-4130-8068-236289d5d616 was re-scheduled: A specified parameter was not correct: fileType [ 706.803601] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 706.803925] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 706.804107] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 706.804264] env[66583]: DEBUG nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 706.804427] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 707.806564] env[66583]: DEBUG nova.network.neutron [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 707.824965] env[66583]: INFO nova.compute.manager [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: fac7d6a8-d74e-4130-8068-236289d5d616] Took 1.02 seconds to deallocate network for instance. [ 707.926066] env[66583]: INFO nova.scheduler.client.report [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Deleted allocations for instance fac7d6a8-d74e-4130-8068-236289d5d616 [ 707.947323] env[66583]: DEBUG oslo_concurrency.lockutils [None req-63df7daf-c7f4-4934-b77e-886103e34ae5 tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "fac7d6a8-d74e-4130-8068-236289d5d616" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 102.504s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 707.962098] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 708.025921] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 708.026241] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 708.028184] env[66583]: INFO nova.compute.claims [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 708.309183] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcce0634-91e8-4a02-814e-3c0967caf23e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.318042] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09d4bcd2-27fc-48ca-9677-7df08707f111 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.350185] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10b93d7d-01e7-4128-9651-123d3c5014c8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.358615] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5cda48d-ddca-4de5-98e2-9545889548df {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.372456] env[66583]: DEBUG nova.compute.provider_tree [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 708.382276] env[66583]: DEBUG nova.scheduler.client.report [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 708.405160] env[66583]: DEBUG oslo_concurrency.lockutils [None 
req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.405160] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 708.475025] env[66583]: DEBUG nova.compute.utils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 708.475914] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 708.481663] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 708.491106] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 708.579104] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 708.604353] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 708.604353] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 708.604353] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 708.605267] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 708.605267] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 708.605267] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 708.605377] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 708.605571] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 708.605799] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 708.605986] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 708.606182] env[66583]: DEBUG nova.virt.hardware [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 708.607070] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec85f8b1-6c00-40a5-871d-4d0b5c569358 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.617219] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7c53f69-5759-4536-b871-3db88c1a623d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 708.707510] env[66583]: DEBUG nova.policy [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '781ac8c793a3466680922508efae64c9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '856f3dbf758244648194dc089dee69b1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 709.988017] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fc79f6a0-9420-4e07-ac7c-1bebac286d1d tempest-ServerActionsTestOtherB-1703846471 tempest-ServerActionsTestOtherB-1703846471-project-member] Acquiring lock "408735e7-0c1b-406e-b72d-8a0396830264" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 709.988324] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fc79f6a0-9420-4e07-ac7c-1bebac286d1d tempest-ServerActionsTestOtherB-1703846471 tempest-ServerActionsTestOtherB-1703846471-project-member] Lock "408735e7-0c1b-406e-b72d-8a0396830264" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 710.000572] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 
4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Successfully created port: 19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.286009] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Successfully updated port: 19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 711.297875] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 711.298041] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquired lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 711.298269] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.381816] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.776672] env[66583]: DEBUG nova.compute.manager [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Received event network-vif-plugged-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 711.777061] env[66583]: DEBUG oslo_concurrency.lockutils [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] Acquiring lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 711.777438] env[66583]: DEBUG oslo_concurrency.lockutils [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] Lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 711.778836] env[66583]: DEBUG oslo_concurrency.lockutils [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] Lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 711.778836] env[66583]: DEBUG nova.compute.manager [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] No waiting events found dispatching network-vif-plugged-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 711.778836] env[66583]: WARNING nova.compute.manager [req-51c4ed27-9526-4344-b09c-0a8cf0f85ae8 req-ec50808a-9ced-4bc2-a466-0c5ff5677fa8 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Received unexpected event network-vif-plugged-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c for instance with vm_state building and task_state spawning. 
[ 711.782251] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Updating instance_info_cache with network_info: [{"id": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "address": "fa:16:3e:b5:d2:16", "network": {"id": "4f62c14e-4f05-4696-bbac-01e04a7cc44f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-123920933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "856f3dbf758244648194dc089dee69b1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7c708997-9b6e-4c27-8a58-02c0d1359d5c", "external-id": "nsx-vlan-transportzone-370", "segmentation_id": 370, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap19aeb4fb-ff", "ovs_interfaceid": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.804883] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Releasing lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 711.804883] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance network_info: |[{"id": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "address": "fa:16:3e:b5:d2:16", "network": {"id": "4f62c14e-4f05-4696-bbac-01e04a7cc44f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-123920933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "856f3dbf758244648194dc089dee69b1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7c708997-9b6e-4c27-8a58-02c0d1359d5c", "external-id": "nsx-vlan-transportzone-370", "segmentation_id": 370, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap19aeb4fb-ff", "ovs_interfaceid": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 711.805130] 
env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b5:d2:16', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7c708997-9b6e-4c27-8a58-02c0d1359d5c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '19aeb4fb-ff1b-49f7-978f-0b3c70290a9c', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 711.823094] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Creating folder: Project (856f3dbf758244648194dc089dee69b1). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.824293] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a4d2a685-8751-41e9-87c5-4719c6006912 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.845847] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Created folder: Project (856f3dbf758244648194dc089dee69b1) in parent group-v693485. [ 711.846061] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Creating folder: Instances. Parent ref: group-v693527. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.846307] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a2f44472-7761-4dad-806e-207b66b46902 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.858104] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Created folder: Instances in parent group-v693527. [ 711.858557] env[66583]: DEBUG oslo.service.loopingcall [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 711.858557] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 711.858740] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9044d9b6-fd12-4146-b605-d26b7d366d4f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 711.881608] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 711.881608] env[66583]: value = "task-3470278" [ 711.881608] env[66583]: _type = "Task" [ 711.881608] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 711.893249] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470278, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 712.392255] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470278, 'name': CreateVM_Task, 'duration_secs': 0.286146} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 712.392533] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 712.393229] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 712.393461] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 712.393878] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 712.394189] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5fe1f7af-01c9-4ab3-8ebe-e956b2baf690 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 712.399635] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Waiting for the task: (returnval){ [ 712.399635] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52d8c22a-8fe8-cfd6-13c3-65f2648e0c1e" [ 712.399635] env[66583]: _type = "Task" [ 712.399635] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 712.408457] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52d8c22a-8fe8-cfd6-13c3-65f2648e0c1e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 712.916738] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 712.916738] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 712.916738] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 715.441974] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "87acbe03-624d-454c-b108-0566ca0d750e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 715.442338] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "87acbe03-624d-454c-b108-0566ca0d750e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.464303] env[66583]: DEBUG nova.compute.manager [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Received event network-changed-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 716.464552] env[66583]: DEBUG nova.compute.manager [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Refreshing instance network info cache due to event network-changed-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 716.464718] env[66583]: DEBUG oslo_concurrency.lockutils [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] Acquiring lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 716.464927] env[66583]: DEBUG oslo_concurrency.lockutils [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] Acquired lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 716.469379] env[66583]: DEBUG nova.network.neutron [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Refreshing network info cache for port 19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 716.539141] env[66583]: DEBUG oslo_concurrency.lockutils [None req-9cede889-2fbf-4812-833f-f974e9d0992a tempest-AttachInterfacesUnderV243Test-18895844 tempest-AttachInterfacesUnderV243Test-18895844-project-member] Acquiring lock "a5fa8d3d-ad60-4749-bba1-0e00538a543f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 716.539526] env[66583]: DEBUG oslo_concurrency.lockutils [None req-9cede889-2fbf-4812-833f-f974e9d0992a tempest-AttachInterfacesUnderV243Test-18895844 tempest-AttachInterfacesUnderV243Test-18895844-project-member] Lock "a5fa8d3d-ad60-4749-bba1-0e00538a543f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 716.984434] env[66583]: DEBUG nova.network.neutron [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Updated VIF entry in instance network info cache for port 19aeb4fb-ff1b-49f7-978f-0b3c70290a9c. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 716.984434] env[66583]: DEBUG nova.network.neutron [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Updating instance_info_cache with network_info: [{"id": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "address": "fa:16:3e:b5:d2:16", "network": {"id": "4f62c14e-4f05-4696-bbac-01e04a7cc44f", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-123920933-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "856f3dbf758244648194dc089dee69b1", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7c708997-9b6e-4c27-8a58-02c0d1359d5c", "external-id": "nsx-vlan-transportzone-370", "segmentation_id": 370, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap19aeb4fb-ff", "ovs_interfaceid": "19aeb4fb-ff1b-49f7-978f-0b3c70290a9c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 716.993666] env[66583]: DEBUG oslo_concurrency.lockutils [req-592d9e5b-85b7-4ca7-9372-ae5793803fed req-a71be01b-3a52-4c4c-935a-7b7253947147 service nova] Releasing lock "refresh_cache-4fde404c-9011-4e1a-8b3c-8c89e5e45c00" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 717.945098] env[66583]: DEBUG oslo_concurrency.lockutils [None req-682c93b0-1784-49bb-8a4e-923c025cf824 tempest-ServersTestBootFromVolume-1638954955 tempest-ServersTestBootFromVolume-1638954955-project-member] Acquiring lock "0a575fbd-2390-401a-8df0-47a40e187c87" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 717.945505] env[66583]: DEBUG oslo_concurrency.lockutils [None req-682c93b0-1784-49bb-8a4e-923c025cf824 tempest-ServersTestBootFromVolume-1638954955 tempest-ServersTestBootFromVolume-1638954955-project-member] Lock "0a575fbd-2390-401a-8df0-47a40e187c87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 718.476129] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b3654290-ffbc-4a3b-8b9f-dc572cd7feb0 tempest-ServerMetadataTestJSON-1507349439 tempest-ServerMetadataTestJSON-1507349439-project-member] Acquiring lock "1a9f02ca-7220-490c-81ed-bf2422173315" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 718.476710] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b3654290-ffbc-4a3b-8b9f-dc572cd7feb0 tempest-ServerMetadataTestJSON-1507349439 tempest-ServerMetadataTestJSON-1507349439-project-member] Lock 
"1a9f02ca-7220-490c-81ed-bf2422173315" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.249129] env[66583]: DEBUG oslo_concurrency.lockutils [None req-236ee1f0-ffad-4b81-864d-ae206bb4dd43 tempest-ServerActionsTestJSON-55625096 tempest-ServerActionsTestJSON-55625096-project-member] Acquiring lock "2f03a941-3722-4df8-af76-3bd073f8927b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.249405] env[66583]: DEBUG oslo_concurrency.lockutils [None req-236ee1f0-ffad-4b81-864d-ae206bb4dd43 tempest-ServerActionsTestJSON-55625096 tempest-ServerActionsTestJSON-55625096-project-member] Lock "2f03a941-3722-4df8-af76-3bd073f8927b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.631877] env[66583]: DEBUG oslo_concurrency.lockutils [None req-11979b60-2afd-4522-8f39-da403183c148 tempest-AttachVolumeTestJSON-25219850 tempest-AttachVolumeTestJSON-25219850-project-member] Acquiring lock "d04a1c66-b45e-4266-9e98-2682f7fa42d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.632167] env[66583]: DEBUG oslo_concurrency.lockutils [None req-11979b60-2afd-4522-8f39-da403183c148 tempest-AttachVolumeTestJSON-25219850 tempest-AttachVolumeTestJSON-25219850-project-member] Lock "d04a1c66-b45e-4266-9e98-2682f7fa42d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 721.313660] env[66583]: DEBUG oslo_concurrency.lockutils [None req-cbe21ec0-289f-4a4f-ab3e-5aa593f92ec5 tempest-ServerGroupTestJSON-906624891 tempest-ServerGroupTestJSON-906624891-project-member] Acquiring lock "b3cb9c35-714c-4ce5-b826-0c8398ed93b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 721.313942] env[66583]: DEBUG oslo_concurrency.lockutils [None req-cbe21ec0-289f-4a4f-ab3e-5aa593f92ec5 tempest-ServerGroupTestJSON-906624891 tempest-ServerGroupTestJSON-906624891-project-member] Lock "b3cb9c35-714c-4ce5-b826-0c8398ed93b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 739.847013] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 739.847330] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.847126] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.847465] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.847509] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 740.847642] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 740.858583] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.858804] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.858980] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 740.859155] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 740.860349] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de8793b7-783b-4e76-bc8b-fd911df47449 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.869858] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9659f39b-6430-420a-90fb-fb077afce3ec {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.883851] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e389c407-80b7-4af8-95bb-58b8dd40a1e0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.890280] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3715953b-c93b-4200-8d38-f73c54c8e3f0 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 740.921130] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180880MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 740.921289] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 740.921484] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 740.984067] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance f5415bfe-3f3a-4f4b-985d-59655791bb2b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984261] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a0bd3693-ed3f-4573-8250-85ae19a08869 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984404] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance da4dff27-123e-44ac-83b5-1b2b3d731e0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984527] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 3816b87a-030d-4362-9596-bd0899455e52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984649] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984766] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984880] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.984994] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a14dfb60-e62a-4a74-9f5b-f031814c609e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 740.985126] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 741.009786] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.032898] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.043326] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 9915557d-4251-44a2-bf59-3dd542dfb527 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.054402] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.064253] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 63244459-f37b-4fdb-8afc-9e4a80156099 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.073792] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a14582eb-f78f-44d6-8c82-16976c0cec5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.082620] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 408735e7-0c1b-406e-b72d-8a0396830264 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.091610] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 87acbe03-624d-454c-b108-0566ca0d750e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.101044] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a5fa8d3d-ad60-4749-bba1-0e00538a543f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.110670] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 0a575fbd-2390-401a-8df0-47a40e187c87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.122174] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 1a9f02ca-7220-490c-81ed-bf2422173315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.134197] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 2f03a941-3722-4df8-af76-3bd073f8927b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.144446] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance d04a1c66-b45e-4266-9e98-2682f7fa42d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.153458] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance b3cb9c35-714c-4ce5-b826-0c8398ed93b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 741.153683] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 741.153829] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 741.400820] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6440ce-5352-4228-84bf-bec84579d41d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.409235] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c01b8e-5d4a-441f-8ca0-41c79aa722bd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.439821] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba1c6ccc-3281-4be3-9dea-e095a4e3cecb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.446798] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ec037a4-0389-49b4-b04c-966066a249b2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 741.459864] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) 
update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 741.468506] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 741.481850] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 741.481850] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.560s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 742.477263] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.477495] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.477629] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 742.477793] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 742.496551] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.496720] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.496841] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Skipping network cache update for instance because it is Building. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.496967] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497108] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497238] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497361] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497479] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497599] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 742.497715] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 742.846301] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 742.846571] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 744.615590] env[66583]: WARNING oslo_vmware.rw_handles [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 744.615590] env[66583]: ERROR oslo_vmware.rw_handles [ 744.616228] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 744.617190] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 744.617441] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Copying Virtual Disk [datastore1] vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] 
vmware_temp/ce026a1e-1bf7-4819-97e0-61b1dd2204cf/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 744.617717] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cf6f70d9-38fb-4832-a822-cf2680270abc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 744.626233] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Waiting for the task: (returnval){ [ 744.626233] env[66583]: value = "task-3470279" [ 744.626233] env[66583]: _type = "Task" [ 744.626233] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 744.633772] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Task: {'id': task-3470279, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 745.136870] env[66583]: DEBUG oslo_vmware.exceptions [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 745.137123] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 745.137675] env[66583]: ERROR nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 745.137675] env[66583]: Faults: ['InvalidArgument'] [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Traceback (most recent call last): [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] yield resources [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self.driver.spawn(context, instance, image_meta, [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 
745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self._fetch_image_if_missing(context, vi) [ 745.137675] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] image_cache(vi, tmp_image_ds_loc) [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] vm_util.copy_virtual_disk( [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] session._wait_for_task(vmdk_copy_task) [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return self.wait_for_task(task_ref) [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return evt.wait() [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] result = hub.switch() [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 745.138049] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return self.greenlet.switch() [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self.f(*self.args, **self.kw) [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] raise exceptions.translate_fault(task_info.error) [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] oslo_vmware.exceptions.VimFaultException: 
A specified parameter was not correct: fileType [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Faults: ['InvalidArgument'] [ 745.138403] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] [ 745.138403] env[66583]: INFO nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Terminating instance [ 745.139495] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 745.139699] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 745.139918] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c2b7d48-7b76-4cf8-8f89-ab0682869330 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.141937] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Start destroying the instance on the hypervisor. 
{{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 745.142141] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 745.142830] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ee26b9-0487-4562-885d-1ca72bfc327e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.150394] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 745.150601] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4f888437-47f2-4fd6-a9ee-bfceff39dc65 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.152698] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 745.152863] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 745.153818] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ef44705-0c7e-4c44-a1a9-8bc513cf29b3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.158102] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Waiting for the task: (returnval){ [ 745.158102] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52b9b671-942c-bd85-67fd-40f2988794f7" [ 745.158102] env[66583]: _type = "Task" [ 745.158102] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 745.164914] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52b9b671-942c-bd85-67fd-40f2988794f7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 745.218675] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 745.218883] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 745.219080] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Deleting the datastore file [datastore1] 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 745.219429] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-25dced56-904c-407c-9cdd-3b91a11a2fa4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.225323] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Waiting for the task: (returnval){ [ 745.225323] env[66583]: value = "task-3470281" [ 745.225323] env[66583]: _type = "Task" [ 745.225323] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 745.232809] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Task: {'id': task-3470281, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 745.668581] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 745.668913] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Creating directory with path [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 745.668959] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f035849-6b3b-4e9e-af63-a7995bcea51a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.680652] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Created directory with path [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 745.680843] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Fetch image to [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 745.681084] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 745.681743] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be5f1a1b-fbd7-4cb7-a5de-552d846edd4d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.688035] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23cba8f3-3698-455b-b986-a98d82032d86 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.696903] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437adeff-83a0-4ecd-a0f5-302700611f60 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.729357] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf0d0004-86ce-40be-ab3f-ab1c6f027bbb {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.736535] env[66583]: DEBUG oslo_vmware.api [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Task: {'id': task-3470281, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085211} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 745.737768] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 745.737954] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 745.738146] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 745.738320] env[66583]: INFO nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 745.740357] env[66583]: DEBUG nova.compute.claims [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 745.740518] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 745.740719] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 745.743122] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f43a4e5d-39e6-40e5-ab10-23094a2ded9b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 745.763201] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 745.814261] env[66583]: DEBUG oslo_vmware.rw_handles [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 745.872999] env[66583]: DEBUG oslo_vmware.rw_handles [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 745.873292] env[66583]: DEBUG oslo_vmware.rw_handles [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 746.100620] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6a06795-9a2e-4b74-8be3-33d8c8f53f71 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.110030] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13f29336-5780-4d41-9487-4a32f8c85670 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.138415] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85c0ec2b-25f6-4a4a-ac9e-86b2cfdf9117 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.145351] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab66b5b6-65e2-45ce-a12b-b1c6a2024f88 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.158150] env[66583]: DEBUG nova.compute.provider_tree [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 746.166552] env[66583]: DEBUG nova.scheduler.client.report [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 746.179655] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.439s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.180074] env[66583]: ERROR nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 746.180074] env[66583]: Faults: ['InvalidArgument'] [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Traceback (most recent call last): [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 
746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self.driver.spawn(context, instance, image_meta, [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self._fetch_image_if_missing(context, vi) [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] image_cache(vi, tmp_image_ds_loc) [ 746.180074] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] vm_util.copy_virtual_disk( [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] session._wait_for_task(vmdk_copy_task) [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return self.wait_for_task(task_ref) [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return evt.wait() [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] result = hub.switch() [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] return self.greenlet.switch() [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 746.180449] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] self.f(*self.args, **self.kw) [ 746.180804] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 746.180804] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] raise exceptions.translate_fault(task_info.error) [ 746.180804] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 746.180804] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Faults: ['InvalidArgument'] [ 746.180804] env[66583]: ERROR nova.compute.manager [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] [ 746.180804] env[66583]: DEBUG nova.compute.utils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 746.182083] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Build of instance 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8 was re-scheduled: A specified parameter was not correct: fileType [ 746.182083] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 746.182459] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 746.182631] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 746.182800] env[66583]: DEBUG nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 746.182961] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 746.397842] env[66583]: DEBUG nova.network.neutron [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 746.408453] env[66583]: INFO nova.compute.manager [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] [instance: 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8] Took 0.23 seconds to deallocate network for instance. [ 746.494728] env[66583]: INFO nova.scheduler.client.report [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Deleted allocations for instance 0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8 [ 746.513522] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b7abeef4-8c09-4de4-9170-cd5983fdd9f9 tempest-ServersNegativeTestMultiTenantJSON-1439013938 tempest-ServersNegativeTestMultiTenantJSON-1439013938-project-member] Lock "0f9f6feb-a561-4dc7-8cf8-71c1ad71faf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 57.527s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.530484] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 746.589761] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 746.590076] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 746.591637] env[66583]: INFO nova.compute.claims [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 746.870821] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69623c2e-3503-4c3b-90cd-830de149db30 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.878358] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf34952b-2e07-422e-ab42-86a6a3633b74 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.909336] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69368506-3921-4e7e-9346-36fb93d7110b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.916342] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55b4c9e5-1527-4944-b7f6-ba6b763d060f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 746.929148] env[66583]: DEBUG nova.compute.provider_tree [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 746.940709] env[66583]: DEBUG nova.scheduler.client.report [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 746.953681] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b 
tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.364s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 746.954200] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 746.986362] env[66583]: DEBUG nova.compute.utils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 746.987814] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 746.987937] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 746.998199] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 747.064994] env[66583]: DEBUG nova.policy [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '6fcf6fa9ccda4ae1a05ecc8634ee3d1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1186be02d7824c779cf54881dada6b7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 747.068209] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 747.088738] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 747.088982] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 747.089157] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 747.089342] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 747.089487] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 747.089631] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 747.089834] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 747.089988] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 747.090168] 
env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 747.090328] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 747.090499] env[66583]: DEBUG nova.virt.hardware [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 747.091546] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8509cc4-3f78-4a04-9e3e-1e9e3c1a9794 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.099419] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4743c1b6-e38e-493a-b10b-2e58dbdc9503 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 747.885913] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Successfully created port: 4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 748.980937] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Successfully updated port: 4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 748.998356] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 748.998503] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquired lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 748.998912] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 749.035978] env[66583]: DEBUG nova.compute.manager [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 
req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Received event network-vif-plugged-4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 749.036219] env[66583]: DEBUG oslo_concurrency.lockutils [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] Acquiring lock "12bc9e29-ecea-40e9-af34-a067f3d2301f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 749.036600] env[66583]: DEBUG oslo_concurrency.lockutils [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] Lock "12bc9e29-ecea-40e9-af34-a067f3d2301f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 749.036600] env[66583]: DEBUG oslo_concurrency.lockutils [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] Lock "12bc9e29-ecea-40e9-af34-a067f3d2301f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 749.036804] env[66583]: DEBUG nova.compute.manager [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] No waiting events found dispatching network-vif-plugged-4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 749.037573] env[66583]: WARNING nova.compute.manager [req-afdf10c9-b7af-4083-aa86-f6a25c83c849 req-f9885a0a-9940-4f1c-af52-3e0c399416a0 service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Received unexpected event network-vif-plugged-4a5ea210-e518-4297-bb4d-a3844f9a1b1e for instance with vm_state building and task_state spawning. [ 749.057107] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 749.394656] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Updating instance_info_cache with network_info: [{"id": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "address": "fa:16:3e:08:1f:bb", "network": {"id": "98576caf-7fbe-4b5d-91b1-de447a8d8789", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-975850535-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1186be02d7824c779cf54881dada6b7b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a5ea210-e5", "ovs_interfaceid": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 749.409036] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Releasing lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 749.409353] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance network_info: |[{"id": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "address": "fa:16:3e:08:1f:bb", "network": {"id": "98576caf-7fbe-4b5d-91b1-de447a8d8789", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-975850535-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1186be02d7824c779cf54881dada6b7b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a5ea210-e5", "ovs_interfaceid": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 749.409726] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:08:1f:bb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ff1f3320-df8e-49df-a412-9797a23bd173', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4a5ea210-e518-4297-bb4d-a3844f9a1b1e', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 749.417233] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Creating folder: Project (1186be02d7824c779cf54881dada6b7b). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 749.417770] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ba593d1-804f-4149-9894-838c59bdbd6b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.428612] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Created folder: Project (1186be02d7824c779cf54881dada6b7b) in parent group-v693485. [ 749.428811] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Creating folder: Instances. Parent ref: group-v693530. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 749.429049] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-144acadb-bd23-47ff-9f92-cb9a1672de7f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.437914] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Created folder: Instances in parent group-v693530. [ 749.438280] env[66583]: DEBUG oslo.service.loopingcall [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 749.438376] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 749.438527] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cdad56a8-96a5-466c-8e96-bb7737463649 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.463522] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 749.463522] env[66583]: value = "task-3470284" [ 749.463522] env[66583]: _type = "Task" [ 749.463522] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 749.472022] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470284, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 749.978114] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470284, 'name': CreateVM_Task, 'duration_secs': 0.334134} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 749.978473] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 749.979280] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 749.979593] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 749.980101] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 749.980655] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c232b3a-3d5f-42dd-963b-f12092852b13 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 749.986691] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Waiting for the task: (returnval){ [ 749.986691] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52834129-42be-0483-edf6-919e42bf25d9" [ 749.986691] env[66583]: _type = "Task" [ 749.986691] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 749.994695] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52834129-42be-0483-edf6-919e42bf25d9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 750.496369] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 750.497219] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 750.497695] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 751.369312] env[66583]: DEBUG nova.compute.manager [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Received event network-changed-4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 751.369596] env[66583]: DEBUG nova.compute.manager [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Refreshing instance network info cache due to event network-changed-4a5ea210-e518-4297-bb4d-a3844f9a1b1e. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 751.369722] env[66583]: DEBUG oslo_concurrency.lockutils [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] Acquiring lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 751.369863] env[66583]: DEBUG oslo_concurrency.lockutils [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] Acquired lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 751.370028] env[66583]: DEBUG nova.network.neutron [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Refreshing network info cache for port 4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 752.210983] env[66583]: DEBUG nova.network.neutron [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Updated VIF entry in instance network info cache for port 4a5ea210-e518-4297-bb4d-a3844f9a1b1e. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 752.213327] env[66583]: DEBUG nova.network.neutron [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Updating instance_info_cache with network_info: [{"id": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "address": "fa:16:3e:08:1f:bb", "network": {"id": "98576caf-7fbe-4b5d-91b1-de447a8d8789", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-975850535-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "1186be02d7824c779cf54881dada6b7b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ff1f3320-df8e-49df-a412-9797a23bd173", "external-id": "nsx-vlan-transportzone-217", "segmentation_id": 217, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4a5ea210-e5", "ovs_interfaceid": "4a5ea210-e518-4297-bb4d-a3844f9a1b1e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 752.221267] env[66583]: DEBUG oslo_concurrency.lockutils [req-9c5dd0d8-7da6-4987-a3d8-085c4cc59315 req-75bf2812-266a-464e-9621-fe461273059e service nova] Releasing lock "refresh_cache-12bc9e29-ecea-40e9-af34-a067f3d2301f" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.172821] env[66583]: WARNING oslo_vmware.rw_handles [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 755.172821] env[66583]: ERROR oslo_vmware.rw_handles [ 755.173555] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc 
tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 755.174923] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 755.175201] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Copying Virtual Disk [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/e235aa03-2b5d-4b57-8967-bb5f118c3116/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 755.175461] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d3b82144-ff80-488d-9586-c5c8ec4c71b5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.183932] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for the task: (returnval){ [ 755.183932] env[66583]: value = "task-3470285" [ 755.183932] env[66583]: _type = "Task" [ 755.183932] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 755.191291] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Task: {'id': task-3470285, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 755.694654] env[66583]: DEBUG oslo_vmware.exceptions [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Fault InvalidArgument not matched. 
{{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 755.694929] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 755.695506] env[66583]: ERROR nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 755.695506] env[66583]: Faults: ['InvalidArgument'] [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Traceback (most recent call last): [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] yield resources [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self.driver.spawn(context, instance, image_meta, [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self._fetch_image_if_missing(context, vi) [ 755.695506] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] image_cache(vi, tmp_image_ds_loc) [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] vm_util.copy_virtual_disk( [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] session._wait_for_task(vmdk_copy_task) [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 755.695836] 
env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return self.wait_for_task(task_ref) [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return evt.wait() [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] result = hub.switch() [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 755.695836] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return self.greenlet.switch() [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self.f(*self.args, **self.kw) [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] raise exceptions.translate_fault(task_info.error) [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Faults: ['InvalidArgument'] [ 755.696230] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] [ 755.696230] env[66583]: INFO nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Terminating instance [ 755.698365] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 755.698555] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquired lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.698739] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.699724] 
env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 755.699928] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 755.700169] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5ac51ad4-fe6b-4369-92f7-7ec764832cd7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.708391] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 755.708613] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 755.710175] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0705d376-2669-424f-9534-b30c62bc8fa5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 755.718472] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Waiting for the task: (returnval){ [ 755.718472] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]529cc4ff-c18c-860a-ed2b-92fc21a075d2" [ 755.718472] env[66583]: _type = "Task" [ 755.718472] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 755.728017] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]529cc4ff-c18c-860a-ed2b-92fc21a075d2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 755.731727] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 756.055801] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 756.063029] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Releasing lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 756.063234] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 756.063482] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 756.064497] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-733c0e8a-4fd2-44f2-8b09-f35b2892abc1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.072296] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 756.072500] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5d7b9496-0814-4ff7-89ef-5982b775fd14 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.114857] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 756.115090] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 756.115254] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Deleting the datastore file [datastore2] f5415bfe-3f3a-4f4b-985d-59655791bb2b {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 756.115490] env[66583]: DEBUG 
oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9a09d784-ee57-43ce-8460-a1506841412a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.121753] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for the task: (returnval){ [ 756.121753] env[66583]: value = "task-3470287" [ 756.121753] env[66583]: _type = "Task" [ 756.121753] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 756.128749] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Task: {'id': task-3470287, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 756.227992] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 756.228333] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Creating directory with path [datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 756.228483] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3d668a5b-0150-47fe-bd04-1d1c4f922d80 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.239332] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Created directory with path [datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 756.239531] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Fetch image to [datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 756.239703] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 756.240439] env[66583]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a519b95-cec6-4699-9671-b8653fc48648 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.247062] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f089b942-4b3b-44fd-9d9b-55e8c451500d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.256197] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d2425a4-3637-4d30-8668-1714a7a2f2db {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.288106] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce38e575-4a35-4b73-b487-6f3882a9937f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.293108] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aca3d788-69a9-487b-b587-c5426b4081d7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 756.314946] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 756.359455] env[66583]: DEBUG oslo_vmware.rw_handles [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 756.417650] env[66583]: DEBUG oslo_vmware.rw_handles [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 756.417838] env[66583]: DEBUG oslo_vmware.rw_handles [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 756.631592] env[66583]: DEBUG oslo_vmware.api [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Task: {'id': task-3470287, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.033142} completed successfully. 
[ 756.631859] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 756.632055] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 756.632236] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 756.632409] env[66583]: INFO nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Took 0.57 seconds to destroy the instance on the hypervisor.
[ 756.632647] env[66583]: DEBUG oslo.service.loopingcall [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 756.632850] env[66583]: DEBUG nova.compute.manager [-] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Skipping network deallocation for instance since networking was not requested. {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 756.635023] env[66583]: DEBUG nova.compute.claims [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 756.635196] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 756.635410] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 756.916797] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b54180f0-c7e1-46f7-90fd-a2b84964fca5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.924368] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6ca212a-8462-4b5f-a700-9e0510c9592c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.953061] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08f55e32-b364-4ec1-9697-3d4f5ace7824 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.959601] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b9e2b2-55f8-44d3-a75d-444d1cc6e498 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 756.973136] env[66583]: DEBUG nova.compute.provider_tree [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 756.982467] env[66583]: DEBUG nova.scheduler.client.report [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 756.995509] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.360s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
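The inventory dict reported above is what placement turns into schedulable capacity: per resource class, capacity = (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A quick check against the reported numbers:

# Reported inventory for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)
# VCPU 192.0  (48 cores oversubscribed 4x)
# MEMORY_MB 196078.0
# DISK_GB 400.0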
[ 756.996095] env[66583]: ERROR nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 756.996095] env[66583]: Faults: ['InvalidArgument']
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Traceback (most recent call last):
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self.driver.spawn(context, instance, image_meta,
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self._fetch_image_if_missing(context, vi)
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] image_cache(vi, tmp_image_ds_loc)
[ 756.996095] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] vm_util.copy_virtual_disk(
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] session._wait_for_task(vmdk_copy_task)
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return self.wait_for_task(task_ref)
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return evt.wait()
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] result = hub.switch()
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] return self.greenlet.switch()
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 756.996692] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] self.f(*self.args, **self.kw)
[ 756.997310] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 756.997310] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] raise exceptions.translate_fault(task_info.error)
[ 756.997310] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 756.997310] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Faults: ['InvalidArgument']
[ 756.997310] env[66583]: ERROR nova.compute.manager [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b]
[ 756.997310] env[66583]: DEBUG nova.compute.utils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 756.998583] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Build of instance f5415bfe-3f3a-4f4b-985d-59655791bb2b was re-scheduled: A specified parameter was not correct: fileType
[ 756.998583] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 756.999054] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 756.999327] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquiring lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 756.999531] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Acquired lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 756.999753] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 757.022573] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 757.079857] env[66583]: DEBUG nova.network.neutron [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 757.087974] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Releasing lock "refresh_cache-f5415bfe-3f3a-4f4b-985d-59655791bb2b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 757.088207] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 757.088391] env[66583]: DEBUG nova.compute.manager [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] [instance: f5415bfe-3f3a-4f4b-985d-59655791bb2b] Skipping network deallocation for instance since networking was not requested. {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}}
[ 757.164716] env[66583]: INFO nova.scheduler.client.report [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Deleted allocations for instance f5415bfe-3f3a-4f4b-985d-59655791bb2b
[ 757.181513] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8df86ea4-8e2e-4e00-9c00-bb1809d42abc tempest-ServersAdmin275Test-1587685944 tempest-ServersAdmin275Test-1587685944-project-member] Lock "f5415bfe-3f3a-4f4b-985d-59655791bb2b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 149.475s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 757.196238] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
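The chain above is oslo.vmware's standard fault translation: _poll_task sees the failed task's error, translate_fault() raises it as a typed exception, and Nova's _do_build_and_run_instance catches it and turns it into a reschedule. A hedged sketch of catching the same fault at the call site (`session` and `vmdk_copy_task` are assumed from context; the handler name is hypothetical):

from oslo_vmware import exceptions as vexc

try:
    session.wait_for_task(vmdk_copy_task)
except vexc.VimFaultException as e:
    # fault_list carries the vSphere fault names, ['InvalidArgument'] here;
    # str(e) is the vCenter message "A specified parameter was not
    # correct: fileType".
    if 'InvalidArgument' in e.fault_list:
        reschedule_build(e)  # hypothetical: mirror Nova's reschedule path
    else:
        raise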
[ 757.243283] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 757.243591] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 757.245122] env[66583]: INFO nova.compute.claims [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 757.537903] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfc73527-abc1-4cb4-bfef-5a77fadfdc0b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 757.545826] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b684910-dacf-4d1a-8ce9-23dcf5905be6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 757.577264] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62ca33b5-36ac-4339-9b4f-cf7e97027f34 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 757.584530] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ae1f20-cdb0-40a7-8856-255022125bdb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 757.597609] env[66583]: DEBUG nova.compute.provider_tree [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 757.605989] env[66583]: DEBUG nova.scheduler.client.report [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 757.622681] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 757.623198] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 757.661060] env[66583]: DEBUG nova.compute.utils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 757.662373] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 757.662601] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 757.671412] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 757.752922] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
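The acquire/release pairs around instance_claim above are oslo.concurrency's lockutils at work: the resource tracker serializes all claim bookkeeping behind a single "compute_resources" lock, and the log records the waited/held durations. In sketch form (the function body is illustrative only):

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim(tracker, instance):
    # Everything here runs with the "compute_resources" lock held, so
    # concurrent claims and aborts cannot interleave their accounting.
    tracker.claims[instance.uuid] = instance.flavor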
[ 757.774891] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 757.775132] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 757.775289] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 757.775486] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 757.775634] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 757.775777] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 757.775980] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 757.776402] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 757.776596] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 757.776763] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 757.776937] env[66583]: DEBUG nova.virt.hardware [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 757.777802] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a5cd23d-e626-4a1e-be86-d8f6e540bfac {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 757.781602] env[66583]: DEBUG nova.policy [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3edca79284fd442b827b360cd499c679', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '97f559f320cc4863a6630dea329de66e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}}
[ 757.788933] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-807288f4-890d-49fa-9897-517d9995f6c5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 758.342436] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Successfully created port: 442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 759.537621] env[66583]: DEBUG nova.compute.manager [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Received event network-vif-plugged-442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 759.537822] env[66583]: DEBUG oslo_concurrency.lockutils [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] Acquiring lock "6deed686-ceca-45a1-b8e4-2461b2e3f039-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
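The topology walk above (per-dimension limit 65536, 1 vCPU, result 1:1:1) is just an enumeration of the (sockets, cores, threads) factorizations of the vCPU count. A toy equivalent of the search, not Nova's actual implementation:

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Yield every (sockets, cores, threads) triple whose product is vcpus
    # and which respects the per-dimension limits.
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        for cores in range(1, min(vcpus, max_cores) + 1):
            for threads in range(1, min(vcpus, max_threads) + 1):
                if sockets * cores * threads == vcpus:
                    yield (sockets, cores, threads)

print(list(possible_topologies(1)))  # [(1, 1, 1)], matching the log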
[ 759.537955] env[66583]: DEBUG oslo_concurrency.lockutils [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] Lock "6deed686-ceca-45a1-b8e4-2461b2e3f039-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 759.538092] env[66583]: DEBUG oslo_concurrency.lockutils [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] Lock "6deed686-ceca-45a1-b8e4-2461b2e3f039-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 759.538262] env[66583]: DEBUG nova.compute.manager [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] No waiting events found dispatching network-vif-plugged-442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 759.538491] env[66583]: WARNING nova.compute.manager [req-6c4c083a-e92b-4e58-84ee-ac0c3ff09059 req-55c8b693-58b6-4137-9c01-d284db313bed service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Received unexpected event network-vif-plugged-442f498f-b231-40f3-9a0f-93dec6c5b76c for instance with vm_state building and task_state spawning.
[ 759.539447] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Successfully updated port: 442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 759.552721] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 759.552890] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquired lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 759.553053] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 759.626421] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 759.939817] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Updating instance_info_cache with network_info: [{"id": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "address": "fa:16:3e:78:91:53", "network": {"id": "9c087d6c-a28a-4b0c-90b6-ff919225d2bc", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-645051115-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97f559f320cc4863a6630dea329de66e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap442f498f-b2", "ovs_interfaceid": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 759.952450] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Releasing lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 759.952763] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance network_info: |[{"id": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "address": "fa:16:3e:78:91:53", "network": {"id": "9c087d6c-a28a-4b0c-90b6-ff919225d2bc", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-645051115-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97f559f320cc4863a6630dea329de66e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap442f498f-b2", "ovs_interfaceid": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
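The network_info payload above is a list of VIF dicts; what the VMware driver needs next (the "Instance VIF info" entry that follows) is a small projection of it. A minimal reader using only keys present in the payload:

def vif_summary(network_info):
    # One summary dict per VIF in the instance_info_cache payload.
    for vif in network_info:
        subnets = vif['network']['subnets']
        yield {
            'port_id': vif['id'],   # neutron port UUID
            'mac': vif['address'],  # fa:16:3e:78:91:53 above
            'ips': [ip['address'] for s in subnets for ip in s['ips']],
            'switch': vif['details'].get('nsx-logical-switch-id'),
        }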
[ 759.953762] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:78:91:53', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ba07329-1d3e-4ba8-8774-d029262318c4', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '442f498f-b231-40f3-9a0f-93dec6c5b76c', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 759.961037] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Creating folder: Project (97f559f320cc4863a6630dea329de66e). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 759.961574] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-539adb25-66db-4706-9aab-39ca8102c877 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 759.973390] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Created folder: Project (97f559f320cc4863a6630dea329de66e) in parent group-v693485.
[ 759.973579] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Creating folder: Instances. Parent ref: group-v693533. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 759.973800] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e5eb852-c6f0-4b5a-85c7-b3b394ff218c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 759.983027] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Created folder: Instances in parent group-v693533.
[ 759.983027] env[66583]: DEBUG oslo.service.loopingcall [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 759.983227] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 759.983477] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d9ebd0e5-9964-49f2-9259-8db8efc9d01f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.002596] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 760.002596] env[66583]: value = "task-3470290"
[ 760.002596] env[66583]: _type = "Task"
[ 760.002596] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 760.012066] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470290, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 760.512792] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470290, 'name': CreateVM_Task, 'duration_secs': 0.285041} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 760.512970] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 760.513899] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 760.514166] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 760.514539] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 760.514785] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d7c7c278-defe-4d65-b510-4f7f249db524 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 760.519087] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Waiting for the task: (returnval){
[ 760.519087] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52239f35-61cb-7d67-f56b-304a57b8933e"
[ 760.519087] env[66583]: _type = "Task"
[ 760.519087] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 760.528797] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52239f35-61cb-7d67-f56b-304a57b8933e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
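task-3470290 above is a Folder.CreateVM_Task call: the driver builds a VirtualMachineConfigSpec from the VIF info and flavor sizing, creates the VM in the freshly created Instances folder, and waits on the task. A hedged sketch of the invocation, with `session`, `config_spec`, `folder_ref`, and `res_pool_ref` assumed to be already built:

# Issue CreateVM_Task against the Instances folder and wait for the result.
task = session.invoke_api(session.vim, 'CreateVM_Task', folder_ref,
                          config=config_spec, pool=res_pool_ref)
task_info = session.wait_for_task(task)
vm_ref = task_info.result  # managed object reference of the new VM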
[ 761.028995] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 761.029521] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 761.030408] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 761.593150] env[66583]: DEBUG nova.compute.manager [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Received event network-changed-442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 761.593774] env[66583]: DEBUG nova.compute.manager [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Refreshing instance network info cache due to event network-changed-442f498f-b231-40f3-9a0f-93dec6c5b76c. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 761.593774] env[66583]: DEBUG oslo_concurrency.lockutils [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] Acquiring lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 761.593774] env[66583]: DEBUG oslo_concurrency.lockutils [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] Acquired lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 761.593991] env[66583]: DEBUG nova.network.neutron [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Refreshing network info cache for port 442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 761.945223] env[66583]: DEBUG nova.network.neutron [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Updated VIF entry in instance network info cache for port 442f498f-b231-40f3-9a0f-93dec6c5b76c. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 761.945829] env[66583]: DEBUG nova.network.neutron [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Updating instance_info_cache with network_info: [{"id": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "address": "fa:16:3e:78:91:53", "network": {"id": "9c087d6c-a28a-4b0c-90b6-ff919225d2bc", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-645051115-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "97f559f320cc4863a6630dea329de66e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba07329-1d3e-4ba8-8774-d029262318c4", "external-id": "nsx-vlan-transportzone-534", "segmentation_id": 534, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap442f498f-b2", "ovs_interfaceid": "442f498f-b231-40f3-9a0f-93dec6c5b76c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 761.958041] env[66583]: DEBUG oslo_concurrency.lockutils [req-7cb510ee-d8ec-4ed3-8c99-d678892c93ef req-f2594bb8-0109-40ed-b4b9-9a4a7c4bfa9d service nova] Releasing lock "refresh_cache-6deed686-ceca-45a1-b8e4-2461b2e3f039" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 794.634814] env[66583]: WARNING oslo_vmware.rw_handles [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles response.begin()
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 794.634814] env[66583]: ERROR oslo_vmware.rw_handles
[ 794.635390] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 794.636753] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 794.637008] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Copying Virtual Disk [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/fe2f6095-3ad3-4406-84c0-96c78753950d/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 794.637279] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7abebc31-50f2-49e3-af69-2891d61ee293 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 794.645603] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Waiting for the task: (returnval){
[ 794.645603] env[66583]: value = "task-3470291"
[ 794.645603] env[66583]: _type = "Task"
[ 794.645603] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 794.653154] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Task: {'id': task-3470291, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 795.157068] env[66583]: DEBUG oslo_vmware.exceptions [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 795.157248] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 795.157814] env[66583]: ERROR nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 795.157814] env[66583]: Faults: ['InvalidArgument']
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Traceback (most recent call last):
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] yield resources
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self.driver.spawn(context, instance, image_meta,
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self._fetch_image_if_missing(context, vi)
[ 795.157814] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] image_cache(vi, tmp_image_ds_loc)
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] vm_util.copy_virtual_disk(
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] session._wait_for_task(vmdk_copy_task)
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return self.wait_for_task(task_ref)
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return evt.wait()
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] result = hub.switch()
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 795.158209] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return self.greenlet.switch()
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self.f(*self.args, **self.kw)
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] raise exceptions.translate_fault(task_info.error)
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Faults: ['InvalidArgument']
[ 795.158611] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e]
[ 795.158611] env[66583]: INFO nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Terminating instance
[ 795.160929] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 795.161137] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 795.161959] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-549cb8b9-1c79-4a10-9fd7-190879c502e4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 795.169251] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 795.169465] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f0076bc2-e41c-4e47-b9d1-d74685012f2b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 796.064913] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 796.065251] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 796.065347] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Deleting the datastore file [datastore1] a14dfb60-e62a-4a74-9f5b-f031814c609e {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 796.065598] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d021d9a7-d69c-4b1b-a20e-7674e4b3fe86 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 796.071986] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Waiting for the task: (returnval){
[ 796.071986] env[66583]: value = "task-3470293"
[ 796.071986] env[66583]: _type = "Task"
[ 796.071986] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 796.079116] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Task: {'id': task-3470293, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 796.581598] env[66583]: DEBUG oslo_vmware.api [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Task: {'id': task-3470293, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074743} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 796.581824] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 796.582019] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 796.582193] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 796.582365] env[66583]: INFO nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Took 1.42 seconds to destroy the instance on the hypervisor.
[ 796.585198] env[66583]: DEBUG nova.compute.claims [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 796.585359] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 796.585569] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 796.877705] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14eb42af-6dd8-47ea-b92e-605f7d3010ef {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.885362] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-848a70b5-ad73-41e1-8e33-1d83f2afec07 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.916624] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0786b0c3-4df8-47d4-b068-da22f7db6494 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.923885] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0262319-df67-4f44-a899-0e602c6a149d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 796.936699] env[66583]: DEBUG nova.compute.provider_tree [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 796.947068] env[66583]: DEBUG nova.scheduler.client.report [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 796.962966] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 
tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.377s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 796.963525] env[66583]: ERROR nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.963525] env[66583]: Faults: ['InvalidArgument'] [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Traceback (most recent call last): [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self.driver.spawn(context, instance, image_meta, [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self._fetch_image_if_missing(context, vi) [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] image_cache(vi, tmp_image_ds_loc) [ 796.963525] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] vm_util.copy_virtual_disk( [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] session._wait_for_task(vmdk_copy_task) [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return self.wait_for_task(task_ref) [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return evt.wait() [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: 
a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] result = hub.switch() [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] return self.greenlet.switch() [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 796.963871] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] self.f(*self.args, **self.kw) [ 796.964194] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 796.964194] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] raise exceptions.translate_fault(task_info.error) [ 796.964194] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 796.964194] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Faults: ['InvalidArgument'] [ 796.964194] env[66583]: ERROR nova.compute.manager [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] [ 796.964318] env[66583]: DEBUG nova.compute.utils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 796.965768] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Build of instance a14dfb60-e62a-4a74-9f5b-f031814c609e was re-scheduled: A specified parameter was not correct: fileType [ 796.965768] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 796.966152] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 796.966324] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 796.966494] env[66583]: DEBUG nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 796.966656] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 797.636826] env[66583]: DEBUG nova.network.neutron [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 797.647020] env[66583]: INFO nova.compute.manager [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] [instance: a14dfb60-e62a-4a74-9f5b-f031814c609e] Took 0.68 seconds to deallocate network for instance. [ 797.736113] env[66583]: INFO nova.scheduler.client.report [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Deleted allocations for instance a14dfb60-e62a-4a74-9f5b-f031814c609e [ 797.757098] env[66583]: DEBUG oslo_concurrency.lockutils [None req-ad8cab84-f2c9-4a6a-a5f8-dd0b8c9fe0ae tempest-ServerDiskConfigTestJSON-264814208 tempest-ServerDiskConfigTestJSON-264814208-project-member] Lock "a14dfb60-e62a-4a74-9f5b-f031814c609e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 108.013s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 797.770536] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 797.815919] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 797.816197] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 797.817625] env[66583]: INFO nova.compute.claims [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 798.089360] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feeb4ed1-2cac-42c1-992e-d2f4f61cb727 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.096962] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fc2b382-f677-476b-a0d3-db68e100a483 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.126281] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0039ad7-c334-456c-aea6-58920c62b6d2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.133267] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e95f7597-e6b2-436a-a25c-63366d054b9d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.145979] env[66583]: DEBUG nova.compute.provider_tree [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 798.155601] env[66583]: DEBUG nova.scheduler.client.report [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 798.172474] env[66583]: DEBUG 
oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.356s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 798.172946] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 798.203742] env[66583]: DEBUG nova.compute.utils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 798.205284] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 798.205354] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 798.214333] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 798.284634] env[66583]: DEBUG nova.policy [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69d3e12b06f34ebd982a349b5075b9c4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c9bbde5730cf4bd5a6fff9cde90323ff', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 798.306506] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 798.328671] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 798.328915] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 798.329087] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 798.329274] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 798.329418] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 798.329562] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 798.329769] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 798.329928] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 798.330104] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 798.330268] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 798.330439] env[66583]: DEBUG nova.virt.hardware [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 798.331292] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88e68bf7-0aa8-4519-863a-3514652f8810 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.338820] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8ec1032-2ed5-46ae-b981-6d44930aa962 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 798.638988] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Successfully created port: d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 799.592857] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Successfully updated port: d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 799.603480] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 799.603641] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquired lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 799.603822] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Building 
network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 799.679448] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 799.842304] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.862994] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 799.986014] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Updating instance_info_cache with network_info: [{"id": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "address": "fa:16:3e:0c:23:dd", "network": {"id": "b83ef089-49b2-4894-8df0-4bbe54829269", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1548729022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c9bbde5730cf4bd5a6fff9cde90323ff", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "74c816b2-b8b0-432e-baac-662ed8ea0417", "external-id": "nsx-vlan-transportzone-776", "segmentation_id": 776, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd43f0eaf-b8", "ovs_interfaceid": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 799.997536] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Releasing lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 799.997536] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance network_info: |[{"id": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "address": "fa:16:3e:0c:23:dd", "network": {"id": "b83ef089-49b2-4894-8df0-4bbe54829269", "bridge": "br-int", "label": 
"tempest-ImagesOneServerNegativeTestJSON-1548729022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c9bbde5730cf4bd5a6fff9cde90323ff", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "74c816b2-b8b0-432e-baac-662ed8ea0417", "external-id": "nsx-vlan-transportzone-776", "segmentation_id": 776, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd43f0eaf-b8", "ovs_interfaceid": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 799.997700] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:23:dd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '74c816b2-b8b0-432e-baac-662ed8ea0417', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd43f0eaf-b8ab-49e8-81fd-d9317a958b0b', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 800.005807] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Creating folder: Project (c9bbde5730cf4bd5a6fff9cde90323ff). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 800.006452] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4d413a0c-9301-40f9-8ee9-289ef846f067 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.019764] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Created folder: Project (c9bbde5730cf4bd5a6fff9cde90323ff) in parent group-v693485. [ 800.019968] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Creating folder: Instances. Parent ref: group-v693536. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 800.020212] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1c22ecb5-c49e-4d0b-ac3b-eddfe42cec36 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.028474] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Created folder: Instances in parent group-v693536. [ 800.028690] env[66583]: DEBUG oslo.service.loopingcall [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 800.028870] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 800.029169] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c09208f5-3601-4404-a897-669bf2695cf6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.048412] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 800.048412] env[66583]: value = "task-3470296" [ 800.048412] env[66583]: _type = "Task" [ 800.048412] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 800.058644] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470296, 'name': CreateVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 800.236147] env[66583]: DEBUG nova.compute.manager [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Received event network-vif-plugged-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 800.236423] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Acquiring lock "9915557d-4251-44a2-bf59-3dd542dfb527-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 800.236638] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Lock "9915557d-4251-44a2-bf59-3dd542dfb527-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 800.236803] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Lock "9915557d-4251-44a2-bf59-3dd542dfb527-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 800.236964] env[66583]: DEBUG nova.compute.manager [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] No waiting events found dispatching network-vif-plugged-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 800.237276] env[66583]: WARNING nova.compute.manager [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Received unexpected event network-vif-plugged-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b for instance with vm_state building and task_state spawning. [ 800.237473] env[66583]: DEBUG nova.compute.manager [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Received event network-changed-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 800.237574] env[66583]: DEBUG nova.compute.manager [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Refreshing instance network info cache due to event network-changed-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 800.237768] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Acquiring lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.237953] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Acquired lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.238131] env[66583]: DEBUG nova.network.neutron [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Refreshing network info cache for port d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 800.558553] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470296, 'name': CreateVM_Task, 'duration_secs': 0.272493} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 800.558730] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 800.559406] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 800.559572] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 800.559890] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 800.560148] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-109a1e76-1a38-4031-9c58-06111e4bdc78 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 800.564493] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Waiting for the task: (returnval){ [ 800.564493] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52f950f6-2ec0-26bc-3105-3b6a11677c50" [ 800.564493] env[66583]: _type = "Task" [ 800.564493] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 800.573257] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52f950f6-2ec0-26bc-3105-3b6a11677c50, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 800.723632] env[66583]: DEBUG nova.network.neutron [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Updated VIF entry in instance network info cache for port d43f0eaf-b8ab-49e8-81fd-d9317a958b0b. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 800.724053] env[66583]: DEBUG nova.network.neutron [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Updating instance_info_cache with network_info: [{"id": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "address": "fa:16:3e:0c:23:dd", "network": {"id": "b83ef089-49b2-4894-8df0-4bbe54829269", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1548729022-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c9bbde5730cf4bd5a6fff9cde90323ff", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "74c816b2-b8b0-432e-baac-662ed8ea0417", "external-id": "nsx-vlan-transportzone-776", "segmentation_id": 776, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd43f0eaf-b8", "ovs_interfaceid": "d43f0eaf-b8ab-49e8-81fd-d9317a958b0b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 800.733991] env[66583]: DEBUG oslo_concurrency.lockutils [req-40fb4d12-d0f7-4e6b-bd11-e3b7cb3a412d req-b9aedcab-a263-45b4-ba4e-2edc48f08a5b service nova] Releasing lock "refresh_cache-9915557d-4251-44a2-bf59-3dd542dfb527" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 800.847179] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 801.074789] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 801.075111] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None 
req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 801.075365] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 802.846240] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.846591] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.846653] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.846765] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 802.846917] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 802.856871] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.857100] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.857273] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 802.857427] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 802.858906] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f27568-5672-4e6c-b34a-75ab5c1f8a7e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.867412] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-749f4bf3-07ee-422d-90ae-e3356948b35c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.882043] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-456893e0-ee8e-4191-bce1-089e5d746fe9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.889353] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1907dd4c-ab0c-4a17-b0d7-db4f04f1e0f9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 802.916702] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 802.916842] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 802.917048] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 802.979518] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a0bd3693-ed3f-4573-8250-85ae19a08869 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.979700] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance da4dff27-123e-44ac-83b5-1b2b3d731e0a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.979800] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 3816b87a-030d-4362-9596-bd0899455e52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.979930] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.980067] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.980201] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.980323] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.980441] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.980557] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 9915557d-4251-44a2-bf59-3dd542dfb527 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 802.990951] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.000953] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 63244459-f37b-4fdb-8afc-9e4a80156099 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.009951] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a14582eb-f78f-44d6-8c82-16976c0cec5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.019710] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 408735e7-0c1b-406e-b72d-8a0396830264 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.028788] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 87acbe03-624d-454c-b108-0566ca0d750e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.041046] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a5fa8d3d-ad60-4749-bba1-0e00538a543f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.050199] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 0a575fbd-2390-401a-8df0-47a40e187c87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.061136] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 1a9f02ca-7220-490c-81ed-bf2422173315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.070476] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 2f03a941-3722-4df8-af76-3bd073f8927b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.079381] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance d04a1c66-b45e-4266-9e98-2682f7fa42d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.088294] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance b3cb9c35-714c-4ce5-b826-0c8398ed93b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 803.088530] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 803.088686] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 803.312379] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d77090e-a501-45f4-b96a-e0d1a1c2dff6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.320066] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20ebb5ee-d44a-4db8-836e-9e39e0603a85 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.350274] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1121337-e690-4573-bbe6-91fb675e182a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.357763] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e53647ac-a453-433f-9792-09134afaecba {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 803.370619] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 803.379480] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 803.392306] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 803.392496] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.475s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 804.387546] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.387944] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.387944] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 804.388091] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 804.408094] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408314] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408378] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408511] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408635] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408759] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.408882] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.409015] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Skipping network cache update for instance because it is Building. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.409141] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 804.409262] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 804.409724] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 804.637528] env[66583]: WARNING oslo_vmware.rw_handles [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 804.637528] env[66583]: ERROR oslo_vmware.rw_handles [ 804.638014] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 804.640066] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 804.640347] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Copying Virtual Disk 
[datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/192b56f6-5f4e-4cea-889b-4c125a824909/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 804.640642] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f50a373b-f729-4cbd-8571-2b042c4e6a4a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 804.648548] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Waiting for the task: (returnval){ [ 804.648548] env[66583]: value = "task-3470297" [ 804.648548] env[66583]: _type = "Task" [ 804.648548] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 804.657512] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Task: {'id': task-3470297, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 805.158886] env[66583]: DEBUG oslo_vmware.exceptions [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 805.159121] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 805.159872] env[66583]: ERROR nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 805.159872] env[66583]: Faults: ['InvalidArgument'] [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Traceback (most recent call last): [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] yield resources [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self.driver.spawn(context, instance, image_meta, [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn 
[ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self._vmops.spawn(context, instance, image_meta, injected_files, [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self._fetch_image_if_missing(context, vi) [ 805.159872] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] image_cache(vi, tmp_image_ds_loc) [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] vm_util.copy_virtual_disk( [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] session._wait_for_task(vmdk_copy_task) [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return self.wait_for_task(task_ref) [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return evt.wait() [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] result = hub.switch() [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 805.160225] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return self.greenlet.switch() [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self.f(*self.args, **self.kw) [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] raise exceptions.translate_fault(task_info.error) [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Faults: ['InvalidArgument'] [ 805.160553] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] [ 805.160553] env[66583]: INFO nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Terminating instance [ 805.161470] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 805.161741] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 805.161983] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-64afd079-e5ed-47aa-a381-51bf72522191 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.164757] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 805.164987] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 805.165708] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cf2829f-ae55-420b-86ff-d3d2539f06fb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.169602] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 805.169773] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 805.172269] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3c3efc20-db18-4339-92a7-a3fe7feac0ef {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.174400] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 805.174609] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a68c2516-5e67-4247-a22c-dad2d855877e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.179019] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for the task: (returnval){ [ 805.179019] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]525a32f0-8bd9-92c1-fbd4-b26bec9ddd0e" [ 805.179019] env[66583]: _type = "Task" [ 805.179019] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 805.186277] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]525a32f0-8bd9-92c1-fbd4-b26bec9ddd0e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 805.245192] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 805.245481] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 805.245710] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Deleting the datastore file [datastore2] a0bd3693-ed3f-4573-8250-85ae19a08869 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 805.245979] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6f45c832-bd13-4987-aa4e-ab6c58e1c884 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.251853] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Waiting for the task: (returnval){ [ 805.251853] env[66583]: value = "task-3470299" [ 805.251853] env[66583]: _type = "Task" [ 805.251853] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 805.260450] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Task: {'id': task-3470299, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 805.689311] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 805.689684] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Creating directory with path [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 805.690016] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61064332-ff39-461c-a4de-cebac9358fca {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.702063] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Created directory with path [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 805.702331] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Fetch image to [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 805.702552] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 805.703651] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0970ced-366e-416a-af09-48f78e696165 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.710634] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8511a0de-c66d-495c-886c-7af1ae29702b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.720204] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c0e931c-8825-48e8-acfb-e23817254cc3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.751512] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ab967ad-bacf-4a7d-b206-74ee7fd5c1c9 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.763844] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aaae0a51-706e-4458-a510-35bf3e8c5791 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 805.765641] env[66583]: DEBUG oslo_vmware.api [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Task: {'id': task-3470299, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067961} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 805.765881] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 805.766617] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 805.766775] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 805.766956] env[66583]: INFO nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Took 0.60 seconds to destroy the instance on the hypervisor. 
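The task lifecycle visible above — "Waiting for the task", the repeated "progress is 0%" polls at api.py:434, completion reported at api.py:444, and fault translation at api.py:448 — is all produced by one polling loop inside oslo.vmware. The sketch below is only a minimal illustration of that loop's shape, not the real implementation: get_task_info, its info object, and TaskFault are hypothetical stand-ins for the vSphere SDK types that the logged wait_for_task actually drives.

import time

class TaskFault(Exception):
    """Hypothetical stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(session, task_ref, interval=0.5):
    # Poll the server-side task until it reaches a terminal state --
    # the same shape as the _poll_task loop logged at api.py:434-448.
    while True:
        info = session.get_task_info(task_ref)  # hypothetical helper
        if info.state == 'running':
            # Matches the "progress is 0%" DEBUG lines above.
            print("Task: %s progress is %s%%." % (task_ref, info.progress))
            time.sleep(interval)
        elif info.state == 'success':
            # Matches "completed successfully" (api.py:444).
            return info.result
        else:
            # 'error': translate the server fault and raise, as api.py:448
            # does with exceptions.translate_fault(task_info.error).
            raise TaskFault(info.error)

The real oslo.vmware drives this loop from an eventlet looping call rather than time.sleep; the sleep here is only to keep the sketch self-contained.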
[ 805.769464] env[66583]: DEBUG nova.compute.claims [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 805.769643] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 805.769862] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 805.789376] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 805.846146] env[66583]: DEBUG oslo_vmware.rw_handles [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 805.908857] env[66583]: DEBUG oslo_vmware.rw_handles [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 805.909129] env[66583]: DEBUG oslo_vmware.rw_handles [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 806.113582] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa928982-73cd-4041-93da-892dc845cde8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.121205] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45c91c45-bdc7-416a-af3f-cadb753c0539 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.150975] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-483c30a7-b5d4-4a27-98f8-3aa50ba342ab {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.157846] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-454a29ff-f2f6-4cc5-8ad8-02a5cfd68149 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 806.170809] env[66583]: DEBUG nova.compute.provider_tree [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 806.180396] env[66583]: DEBUG nova.scheduler.client.report [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 806.194018] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.424s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 806.194332] env[66583]: ERROR nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 806.194332] env[66583]: Faults: ['InvalidArgument'] [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Traceback (most recent call last): [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] 
self.driver.spawn(context, instance, image_meta, [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self._vmops.spawn(context, instance, image_meta, injected_files, [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self._fetch_image_if_missing(context, vi) [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] image_cache(vi, tmp_image_ds_loc) [ 806.194332] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] vm_util.copy_virtual_disk( [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] session._wait_for_task(vmdk_copy_task) [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return self.wait_for_task(task_ref) [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return evt.wait() [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] result = hub.switch() [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] return self.greenlet.switch() [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 806.196094] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] self.f(*self.args, **self.kw) [ 806.196441] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 806.196441] env[66583]: ERROR nova.compute.manager [instance: 
a0bd3693-ed3f-4573-8250-85ae19a08869] raise exceptions.translate_fault(task_info.error) [ 806.196441] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 806.196441] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Faults: ['InvalidArgument'] [ 806.196441] env[66583]: ERROR nova.compute.manager [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] [ 806.196441] env[66583]: DEBUG nova.compute.utils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 806.196723] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Build of instance a0bd3693-ed3f-4573-8250-85ae19a08869 was re-scheduled: A specified parameter was not correct: fileType [ 806.196723] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 806.197160] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 806.197370] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 806.197538] env[66583]: DEBUG nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 806.197700] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 806.750165] env[66583]: DEBUG nova.network.neutron [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 806.765670] env[66583]: INFO nova.compute.manager [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] [instance: a0bd3693-ed3f-4573-8250-85ae19a08869] Took 0.57 seconds to deallocate network for instance. 
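The stacked "During handling of the above exception, another exception occurred" tracebacks that follow are the signature of the save-and-reraise pattern from oslo.utils: cleanup code runs while the original exception is held, then the context manager's __exit__ re-raises it, which is why frames from oslo_utils/excutils.py:227 and :200 appear between every layer. A minimal sketch of the pattern — excutils is the real oslo.utils API, while risky() and its cleanup body are hypothetical placeholders:

from oslo_utils import excutils

def risky():
    # Stands in for driver.spawn() raising the VimFaultException above.
    raise RuntimeError("A specified parameter was not correct: fileType")

def build_instance():
    try:
        risky()
    except Exception:
        # The context manager captures the in-flight exception, lets the
        # body perform cleanup, and re-raises the original on exit --
        # producing the force_reraise() / "raise self.value" frames
        # visible in the traceback below.
        with excutils.save_and_reraise_exception():
            print("cleanup: abort claim, deallocate network")

The InstanceNotFound_Remote at the end of that chain is the third layer: by the time the reschedule path reaches instance.save(), the instance row is already gone from the database (in a tempest run, most likely because the test tore the server down while the build was being rescheduled), so the conductor's update finds no row for a0bd3693-ed3f-4573-8250-85ae19a08869.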
[ 806.888152] env[66583]: DEBUG oslo_concurrency.lockutils [None req-60c63938-68b9-428f-97d6-a211838d28d2 tempest-ServerDiagnosticsTest-1751289038 tempest-ServerDiagnosticsTest-1751289038-project-member] Lock "a0bd3693-ed3f-4573-8250-85ae19a08869" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.020s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 806.891871] env[66583]: Traceback (most recent call last): [ 806.891871] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 806.891871] env[66583]: self.driver.spawn(context, instance, image_meta, [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 806.891871] env[66583]: self._vmops.spawn(context, instance, image_meta, injected_files, [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 806.891871] env[66583]: self._fetch_image_if_missing(context, vi) [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 806.891871] env[66583]: image_cache(vi, tmp_image_ds_loc) [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 806.891871] env[66583]: vm_util.copy_virtual_disk( [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 806.891871] env[66583]: session._wait_for_task(vmdk_copy_task) [ 806.891871] env[66583]: File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 806.891871] env[66583]: return self.wait_for_task(task_ref) [ 806.891871] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 806.891871] env[66583]: return evt.wait() [ 806.891871] env[66583]: File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 806.891871] env[66583]: result = hub.switch() [ 806.891871] env[66583]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 806.891871] env[66583]: return self.greenlet.switch() [ 806.891871] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 806.891871] env[66583]: self.f(*self.args, **self.kw) [ 806.891871] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 806.891871] env[66583]: raise exceptions.translate_fault(task_info.error) [ 806.891871] env[66583]: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 806.891871] env[66583]: Faults: ['InvalidArgument'] [ 806.891871] env[66583]: During handling of the above exception, another exception occurred: [ 806.891871] env[66583]: Traceback (most recent call last): [ 806.891871] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 806.891871] env[66583]: self._build_and_run_instance(context, instance, image, [ 806.891871] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 2718, in _build_and_run_instance [ 806.892781] env[66583]: raise exception.RescheduledException( [ 806.892781] env[66583]: nova.exception.RescheduledException: Build of instance a0bd3693-ed3f-4573-8250-85ae19a08869 was re-scheduled: A specified parameter was not correct: fileType [ 806.892781] env[66583]: Faults:
['InvalidArgument'] [ 806.892781] env[66583]: During handling of the above exception, another exception occurred: [ 806.892781] env[66583]: Traceback (most recent call last): [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 806.892781] env[66583]: listener.cb(fileno) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 806.892781] env[66583]: return func(*args, **kwargs) [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 806.892781] env[66583]: return f(*args, **kwargs) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 2317, in _locked_do_build_and_run_instance [ 806.892781] env[66583]: result = self._do_build_and_run_instance(*args, **kwargs) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 806.892781] env[66583]: with excutils.save_and_reraise_exception(): [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.892781] env[66583]: self.force_reraise() [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.892781] env[66583]: raise self.value [ 806.892781] env[66583]: File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 806.892781] env[66583]: return f(self, context, *args, **kw) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 806.892781] env[66583]: with excutils.save_and_reraise_exception(): [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.892781] env[66583]: self.force_reraise() [ 806.892781] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.892781] env[66583]: raise self.value [ 806.892781] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 806.892781] env[66583]: return function(self, context, *args, **kwargs) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 806.892781] env[66583]: return function(self, context, *args, **kwargs) [ 806.892781] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 806.893594] env[66583]: return function(self, context, *args, **kwargs) [ 806.893594] env[66583]: File "/opt/stack/nova/nova/compute/manager.py", line 2461, in _do_build_and_run_instance [ 806.893594] env[66583]: instance.save() [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 806.893594] env[66583]: updates, result = self.indirection_api.object_action( [ 806.893594] env[66583]: File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 806.893594] env[66583]: return cctxt.call(context, 'object_action', objinst=objinst, [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 806.893594] env[66583]: result = self.transport._send( [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 806.893594] env[66583]: return self._driver.send(target, ctxt, message, [ 806.893594] env[66583]: File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 806.893594] env[66583]: return self._send(target, ctxt, message, wait_for_reply, timeout, [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 806.893594] env[66583]: raise result [ 806.893594] env[66583]: nova.exception_Remote.InstanceNotFound_Remote: Instance a0bd3693-ed3f-4573-8250-85ae19a08869 could not be found. [ 806.893594] env[66583]: Traceback (most recent call last): [ 806.893594] env[66583]: File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 806.893594] env[66583]: return getattr(target, method)(*args, **kwargs) [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 806.893594] env[66583]: return fn(self, *args, **kwargs) [ 806.893594] env[66583]: File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 806.893594] env[66583]: old_ref, inst_ref = db.instance_update_and_get_original( [ 806.893594] env[66583]: File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 806.893594] env[66583]: return f(*args, **kwargs) [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 806.893594] env[66583]: with excutils.save_and_reraise_exception() as ectxt: [ 806.893594] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 806.893594] env[66583]: self.force_reraise() [ 806.894377] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 806.894377] env[66583]: raise self.value [ 806.894377] env[66583]: File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 806.894377] env[66583]: return f(*args, **kwargs) [ 806.894377] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 806.894377] env[66583]: return f(context, *args, **kwargs) [ 806.894377] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 806.894377] env[66583]: instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 806.894377] env[66583]: File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 806.894377] env[66583]: raise exception.InstanceNotFound(instance_id=uuid) [ 806.894377] env[66583]: nova.exception.InstanceNotFound: Instance a0bd3693-ed3f-4573-8250-85ae19a08869 could not be found. [ 806.894377] env[66583]: Removing descriptor: 21 [ 806.901998] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 806.951517] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 806.951819] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 806.953275] env[66583]: INFO nova.compute.claims [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 807.237074] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4bb36362-1ef9-4a12-a9b8-e4642bf47384 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.244879] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d77c657f-fa77-4a04-9d7f-71305eb3709f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.275650] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4440da03-b647-45cc-8098-37e5aad488b7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.282731] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-408331a9-8742-4b17-a5a8-46738e4a88b8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.295980] env[66583]: DEBUG nova.compute.provider_tree [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 807.305954] env[66583]: DEBUG nova.scheduler.client.report [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 807.318628] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 
tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.367s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 807.319130] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 807.356335] env[66583]: DEBUG nova.compute.utils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 807.357638] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 807.357831] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 807.366767] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 807.424099] env[66583]: DEBUG nova.policy [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ff8d0b876d894ad786cfd4b4f8f0a5cd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ecf7a6e22f6b4ce78b4dd06bf5b1b80f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 807.455101] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 807.484103] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 807.484310] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 807.484587] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 807.484916] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 807.485179] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 807.485458] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 807.486122] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 807.486414] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 807.486714] env[66583]: DEBUG nova.virt.hardware [None 
req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 807.487016] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 807.488179] env[66583]: DEBUG nova.virt.hardware [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 807.489493] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f65f3fb3-a23b-4613-b830-8fe209f1566b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 807.503626] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08e255bf-dd55-4155-8bcb-8f247c4378c6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 808.029482] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Successfully created port: 43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 808.563359] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 809.629565] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Successfully updated port: 43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 809.647845] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 809.647845] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquired lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 809.647845] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 
tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 809.677251] env[66583]: DEBUG nova.compute.manager [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Received event network-vif-plugged-43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 809.677307] env[66583]: DEBUG oslo_concurrency.lockutils [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] Acquiring lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 809.677545] env[66583]: DEBUG oslo_concurrency.lockutils [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] Lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 809.677635] env[66583]: DEBUG oslo_concurrency.lockutils [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] Lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 809.677810] env[66583]: DEBUG nova.compute.manager [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] No waiting events found dispatching network-vif-plugged-43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 809.677980] env[66583]: WARNING nova.compute.manager [req-08cd2a1b-21fe-42b0-8861-a845e2585fad req-6bd37a67-a0c6-497d-9214-0466c2b5f2a1 service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Received unexpected event network-vif-plugged-43d71ce7-f3f2-477a-908d-0f01e957e661 for instance with vm_state building and task_state spawning. [ 809.700665] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 809.983062] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Updating instance_info_cache with network_info: [{"id": "43d71ce7-f3f2-477a-908d-0f01e957e661", "address": "fa:16:3e:24:75:1c", "network": {"id": "dc7f26ea-0315-4138-acaf-9d7baaf6f27c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1197386722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ecf7a6e22f6b4ce78b4dd06bf5b1b80f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f246b87-f105-4b33-a71d-5caf8e99e074", "external-id": "nsx-vlan-transportzone-583", "segmentation_id": 583, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43d71ce7-f3", "ovs_interfaceid": "43d71ce7-f3f2-477a-908d-0f01e957e661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 809.997160] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Releasing lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 809.997493] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance network_info: |[{"id": "43d71ce7-f3f2-477a-908d-0f01e957e661", "address": "fa:16:3e:24:75:1c", "network": {"id": "dc7f26ea-0315-4138-acaf-9d7baaf6f27c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1197386722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ecf7a6e22f6b4ce78b4dd06bf5b1b80f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f246b87-f105-4b33-a71d-5caf8e99e074", "external-id": "nsx-vlan-transportzone-583", "segmentation_id": 583, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43d71ce7-f3", "ovs_interfaceid": "43d71ce7-f3f2-477a-908d-0f01e957e661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 809.997874] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:24:75:1c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0f246b87-f105-4b33-a71d-5caf8e99e074', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '43d71ce7-f3f2-477a-908d-0f01e957e661', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 810.006454] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Creating folder: Project (ecf7a6e22f6b4ce78b4dd06bf5b1b80f). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 810.007700] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e1b09611-b585-4ec6-b238-5f2ce8770c72 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.018096] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Created folder: Project (ecf7a6e22f6b4ce78b4dd06bf5b1b80f) in parent group-v693485. [ 810.018303] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Creating folder: Instances. Parent ref: group-v693539. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 810.018533] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1d9b0432-18bd-4118-8929-11550500f184 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.030569] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Created folder: Instances in parent group-v693539. [ 810.030809] env[66583]: DEBUG oslo.service.loopingcall [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 810.031059] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 810.031203] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f5c6fc11-05cc-48b1-bfcb-fff42294019e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.053333] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 810.053333] env[66583]: value = "task-3470302" [ 810.053333] env[66583]: _type = "Task" [ 810.053333] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 810.062432] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470302, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 810.566312] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470302, 'name': CreateVM_Task, 'duration_secs': 0.290942} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 810.566498] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 810.567203] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 810.567363] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 810.567689] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 810.567934] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2321dcc3-1ca6-4e94-89c8-9f81b34204d0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 810.573605] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Waiting for the task: (returnval){ [ 810.573605] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52b9d017-f02f-3ce8-a983-6ec569a36bfc" [ 810.573605] env[66583]: _type = "Task" [ 810.573605] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 810.581813] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52b9d017-f02f-3ce8-a983-6ec569a36bfc, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 811.089025] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 811.089420] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 811.089582] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 811.976207] env[66583]: DEBUG nova.compute.manager [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Received event network-changed-43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 811.976418] env[66583]: DEBUG nova.compute.manager [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Refreshing instance network info cache due to event network-changed-43d71ce7-f3f2-477a-908d-0f01e957e661. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 811.976633] env[66583]: DEBUG oslo_concurrency.lockutils [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] Acquiring lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 811.976774] env[66583]: DEBUG oslo_concurrency.lockutils [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] Acquired lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 811.976932] env[66583]: DEBUG nova.network.neutron [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Refreshing network info cache for port 43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 812.439400] env[66583]: DEBUG nova.network.neutron [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Updated VIF entry in instance network info cache for port 43d71ce7-f3f2-477a-908d-0f01e957e661. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 812.439738] env[66583]: DEBUG nova.network.neutron [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Updating instance_info_cache with network_info: [{"id": "43d71ce7-f3f2-477a-908d-0f01e957e661", "address": "fa:16:3e:24:75:1c", "network": {"id": "dc7f26ea-0315-4138-acaf-9d7baaf6f27c", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1197386722-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "ecf7a6e22f6b4ce78b4dd06bf5b1b80f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0f246b87-f105-4b33-a71d-5caf8e99e074", "external-id": "nsx-vlan-transportzone-583", "segmentation_id": 583, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap43d71ce7-f3", "ovs_interfaceid": "43d71ce7-f3f2-477a-908d-0f01e957e661", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 812.449455] env[66583]: DEBUG oslo_concurrency.lockutils [req-38d6763a-04ee-4539-8601-dda1e1ca33f5 req-d69da22f-4ae7-42b6-a483-25238d19b61a service nova] Releasing lock "refresh_cache-83ac0082-b7fe-408d-9d5a-6e614ae7e61a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 813.569380] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "3816b87a-030d-4362-9596-bd0899455e52" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 820.843432] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.833679] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "e7664037-62b0-4195-b935-eab75d232f5d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 821.833956] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "e7664037-62b0-4195-b935-eab75d232f5d" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 823.985753] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "89e32d26-aa13-4b13-9aec-9e35513946e8" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 854.656504] env[66583]: WARNING oslo_vmware.rw_handles [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 854.656504] env[66583]: ERROR oslo_vmware.rw_handles [ 854.657158] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 854.658837] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 854.659108] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Copying Virtual Disk [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/2bf64f60-6cb5-450a-8097-17ba03808a05/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk 
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 854.659423] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-14c23d2a-8c06-4e36-b97a-ccdb6305b501 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 854.667906] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for the task: (returnval){ [ 854.667906] env[66583]: value = "task-3470303" [ 854.667906] env[66583]: _type = "Task" [ 854.667906] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 854.675629] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Task: {'id': task-3470303, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 855.178325] env[66583]: DEBUG oslo_vmware.exceptions [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 855.178608] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 855.179151] env[66583]: ERROR nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 855.179151] env[66583]: Faults: ['InvalidArgument'] [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Traceback (most recent call last): [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] yield resources [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self.driver.spawn(context, instance, image_meta, [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self._fetch_image_if_missing(context, vi) [ 855.179151] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] image_cache(vi, tmp_image_ds_loc) [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] vm_util.copy_virtual_disk( [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] session._wait_for_task(vmdk_copy_task) [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return self.wait_for_task(task_ref) [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return evt.wait() [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] result = hub.switch() [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 855.179644] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return self.greenlet.switch() [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self.f(*self.args, **self.kw) [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] raise exceptions.translate_fault(task_info.error) [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Faults: ['InvalidArgument'] [ 855.179976] env[66583]: ERROR nova.compute.manager [instance: 
da4dff27-123e-44ac-83b5-1b2b3d731e0a] [ 855.179976] env[66583]: INFO nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Terminating instance [ 855.180950] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 855.181185] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 855.181420] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4d56fff2-7078-45ce-909d-fe69e7bc30f4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.183517] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 855.183706] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 855.184475] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d331ebdc-59ca-49a6-a969-df9de3723d5d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.191245] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 855.191485] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6db3ec97-1592-4c77-8533-bd233dfe677e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.193498] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 855.193683] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Folder 
[datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 855.194626] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-77af6535-8109-4853-b737-98f013d3ca1b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.199209] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for the task: (returnval){ [ 855.199209] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]522846a3-502b-c8b3-803e-78fadf318acc" [ 855.199209] env[66583]: _type = "Task" [ 855.199209] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 855.212868] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 855.213093] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Creating directory with path [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 855.213296] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e362562e-68fc-4e6d-8448-f847e94af8a6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.231979] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Created directory with path [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 855.232170] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Fetch image to [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 855.232345] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 855.233148] env[66583]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fa52c91-5b66-4d22-989d-26aea3faaa62 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.240231] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12eff1fa-6f49-4139-8f78-0c42f8d92e2d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.248989] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca219ed6-a789-4a7f-9241-0d535f9dd35f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.254346] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 855.254546] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 855.254726] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Deleting the datastore file [datastore2] da4dff27-123e-44ac-83b5-1b2b3d731e0a {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 855.254939] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1c6e2d40-c1bb-4dd2-b13c-aee3ffa16ca7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.282911] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-275497c4-39e4-4a6a-8a98-2f36b0cab628 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.285643] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for the task: (returnval){ [ 855.285643] env[66583]: value = "task-3470305" [ 855.285643] env[66583]: _type = "Task" [ 855.285643] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 855.290659] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aa07399f-b230-42ef-a4b9-bd8c6af8de87 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 855.294977] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Task: {'id': task-3470305, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 855.313487] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 855.359590] env[66583]: DEBUG oslo_vmware.rw_handles [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 855.416919] env[66583]: DEBUG oslo_vmware.rw_handles [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 855.417154] env[66583]: DEBUG oslo_vmware.rw_handles [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 855.796051] env[66583]: DEBUG oslo_vmware.api [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Task: {'id': task-3470305, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067425} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 855.796571] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 855.796571] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 855.796571] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 855.796817] env[66583]: INFO nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Took 0.61 seconds to destroy the instance on the hypervisor. [ 855.799822] env[66583]: DEBUG nova.compute.claims [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 855.800261] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 855.800261] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.057807] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bdcea84-6d5c-4f41-8f7e-ee24696764fe {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.065167] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6953cd74-ffbd-423b-8ce3-2e632488d895 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.095079] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd0a15f-670b-4017-bd05-b2f3fd11028c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.102331] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-0845eba0-b1de-4dce-a6a1-d79c4aff2f6d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.114860] env[66583]: DEBUG nova.compute.provider_tree [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 856.123842] env[66583]: DEBUG nova.scheduler.client.report [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 856.137063] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.337s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.137607] env[66583]: ERROR nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 856.137607] env[66583]: Faults: ['InvalidArgument'] [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Traceback (most recent call last): [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self.driver.spawn(context, instance, image_meta, [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self._fetch_image_if_missing(context, vi) [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: 
da4dff27-123e-44ac-83b5-1b2b3d731e0a] image_cache(vi, tmp_image_ds_loc) [ 856.137607] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] vm_util.copy_virtual_disk( [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] session._wait_for_task(vmdk_copy_task) [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return self.wait_for_task(task_ref) [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return evt.wait() [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] result = hub.switch() [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] return self.greenlet.switch() [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 856.137971] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] self.f(*self.args, **self.kw) [ 856.138370] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 856.138370] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] raise exceptions.translate_fault(task_info.error) [ 856.138370] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 856.138370] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Faults: ['InvalidArgument'] [ 856.138370] env[66583]: ERROR nova.compute.manager [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] [ 856.138370] env[66583]: DEBUG nova.compute.utils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 856.140134] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b 
tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Build of instance da4dff27-123e-44ac-83b5-1b2b3d731e0a was re-scheduled: A specified parameter was not correct: fileType [ 856.140134] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 856.140134] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 856.140243] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 856.140346] env[66583]: DEBUG nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 856.140502] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 856.515074] env[66583]: DEBUG nova.network.neutron [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 856.528679] env[66583]: INFO nova.compute.manager [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Took 0.39 seconds to deallocate network for instance.
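
Note on the traceback above: it is the standard oslo.vmware task-wait path. Nova's vm_util.copy_virtual_disk starts a CopyVirtualDisk_Task and blocks in session.wait_for_task(), which polls the task object and re-raises the vCenter fault (here InvalidArgument on fileType) as VimFaultException via exceptions.translate_fault. A minimal sketch of that pattern, assuming a placeholder vCenter endpoint and credentials (not this deployment's values), not Nova's actual code:

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    # Hypothetical endpoint/credentials; retry and poll values are illustrative.
    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    def copy_virtual_disk(source_name, dest_name):
        # CopyVirtualDisk_Task returns a Task managed-object reference.
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task',
            session.vim.service_content.virtualDiskManager,
            sourceName=source_name, destName=dest_name)
        try:
            # wait_for_task() polls the task until it finishes and raises
            # VimFaultException when vCenter reports an error state, which
            # is how the fault surfaces in the traceback above.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list carries the raw fault names, e.g. ['InvalidArgument'].
            raise
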
[ 856.614470] env[66583]: INFO nova.scheduler.client.report [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Deleted allocations for instance da4dff27-123e-44ac-83b5-1b2b3d731e0a [ 856.629645] env[66583]: DEBUG oslo_concurrency.lockutils [None req-2f3a08b6-e655-4612-80c3-9a6442ea713b tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 247.211s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.631438] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 48.068s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.631711] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Acquiring lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.631887] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.632082] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.633971] env[66583]: INFO nova.compute.manager [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Terminating instance [ 856.635655] env[66583]: DEBUG nova.compute.manager [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Start destroying the instance on the hypervisor.
{{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 856.635849] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 856.636303] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-73cce138-7e2d-4f9b-b75a-4208bf3bcba0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.645057] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0ffdb39-b642-40a3-9927-0a914f3e2dfa {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.655532] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 856.676804] env[66583]: WARNING nova.virt.vmwareapi.vmops [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance da4dff27-123e-44ac-83b5-1b2b3d731e0a could not be found. [ 856.677012] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 856.677195] env[66583]: INFO nova.compute.manager [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 856.677420] env[66583]: DEBUG oslo.service.loopingcall [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 856.677750] env[66583]: DEBUG nova.compute.manager [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 856.677750] env[66583]: DEBUG nova.network.neutron [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 856.698164] env[66583]: DEBUG nova.network.neutron [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 856.701471] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 856.701696] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 856.703161] env[66583]: INFO nova.compute.claims [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 856.706234] env[66583]: INFO nova.compute.manager [-] [instance: da4dff27-123e-44ac-83b5-1b2b3d731e0a] Took 0.03 seconds to deallocate network for instance. 
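
Note on the "compute_resources" Acquiring/acquired/released triplets in this stretch: they come from oslo.concurrency's lockutils decorator. The resource tracker's instance_claim and abort_instance_claim run under the same in-process semaphore, which is why the claim above waits until the abort of the failed instance releases it. A hedged sketch of the pattern (names illustrative, not Nova's exact decorator stack):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim(instance, nodename):
        # The lockutils wrapper logs 'Acquiring lock ... by ...' before
        # entry and '"released" ... :: held N.NNNs' on exit, producing the
        # triplets seen in the log. Everything in this body runs with the
        # semaphore held, serializing claims against concurrent aborts and
        # inventory updates.
        ...
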
[ 856.814039] env[66583]: DEBUG oslo_concurrency.lockutils [None req-df2aa1ba-1dd7-457d-a1ec-3711a51ed2b5 tempest-ImagesNegativeTestJSON-2005941781 tempest-ImagesNegativeTestJSON-2005941781-project-member] Lock "da4dff27-123e-44ac-83b5-1b2b3d731e0a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.183s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 856.967243] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e1e5d28-b9a6-4080-ae1d-e2db81342cb1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 856.975216] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1063a600-dd8a-4193-979d-e4b1b44a67a2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.005043] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a43184c-ac25-40b2-b5ce-e808a7831d08 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.011916] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-872bf6f2-9ce9-441f-a77b-7b03cbb6253a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.025672] env[66583]: DEBUG nova.compute.provider_tree [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 857.037623] env[66583]: DEBUG nova.scheduler.client.report [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 857.051358] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 857.051752] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Start building networks asynchronously for instance.
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 857.086164] env[66583]: DEBUG nova.compute.utils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 857.088395] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 857.088582] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 857.097515] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 857.132335] env[66583]: INFO nova.virt.block_device [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Booting with volume d7fea663-747f-4be9-83d3-8b31cc2fc950 at /dev/sda [ 857.175866] env[66583]: DEBUG nova.policy [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e666af4b38654bb0842756306c5d4c33', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '723d64146c864441b408b12188438004', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 857.178823] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a0acb662-3e77-402c-8114-35b444a78b6d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.186841] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b35f98cd-f710-411c-ae4d-f7bf45f56505 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.214095] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-9ad2841c-4847-4f79-b317-ad3caa6a5c83 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.221588] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb261918-8603-428a-961f-013e16795702 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.249896] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f2e7534-6742-43c8-ad36-7bd642aa6f06 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.257369] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0972b06f-bd31-47ce-ab5d-ad9b97692992 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.271224] env[66583]: DEBUG nova.virt.block_device [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updating existing volume attachment record: 3c1320cb-2c68-4cd3-af6d-4f5b47a51c1a {{(pid=66583) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 857.498475] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 857.499041] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 857.499649] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 857.499649] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 857.499649] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 857.500147] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 857.500147] env[66583]:
DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 857.500249] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 857.500425] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 857.500923] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 857.500923] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 857.500923] env[66583]: DEBUG nova.virt.hardware [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 857.502009] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-349169c1-8625-44c7-b59b-68398e82fa27 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 857.510901] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f829325-8a5b-46ab-8569-4aa662e7496d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 858.028597] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Successfully created port: 210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 858.846375] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 858.846575] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Cleaning up deleted instances {{(pid=66583) _run_pending_deletes 
/opt/stack/nova/nova/compute/manager.py:11101}} [ 858.863591] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] There are 1 instances to clean {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 858.863885] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 858.907588] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 858.907783] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Cleaning up deleted instances with incomplete migration {{(pid=66583) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 858.918485] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 859.068175] env[66583]: DEBUG nova.compute.manager [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Received event network-vif-plugged-210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 859.068472] env[66583]: DEBUG oslo_concurrency.lockutils [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] Acquiring lock "63244459-f37b-4fdb-8afc-9e4a80156099-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 859.068603] env[66583]: DEBUG oslo_concurrency.lockutils [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] Lock "63244459-f37b-4fdb-8afc-9e4a80156099-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 859.068763] env[66583]: DEBUG oslo_concurrency.lockutils [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] Lock "63244459-f37b-4fdb-8afc-9e4a80156099-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 859.068915] env[66583]: DEBUG nova.compute.manager [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] No waiting events found dispatching network-vif-plugged-210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 859.069091] env[66583]: WARNING nova.compute.manager [req-fb0e9068-ca81-4652-8699-92d27211067f req-9c9ef212-50dd-40ea-a54a-2f7e474d10d8 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Received unexpected event
network-vif-plugged-210b6f54-f21f-4204-ae34-04cfecde6270 for instance with vm_state building and task_state spawning. [ 859.122701] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Successfully updated port: 210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 859.134559] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Acquiring lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 859.134728] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Acquired lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 859.134883] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 859.215948] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 859.525915] env[66583]: DEBUG nova.network.neutron [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updating instance_info_cache with network_info: [{"id": "210b6f54-f21f-4204-ae34-04cfecde6270", "address": "fa:16:3e:56:2e:75", "network": {"id": "73a5d540-a79e-45bd-96f7-be2e6dd3fd08", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1062980772-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "723d64146c864441b408b12188438004", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap210b6f54-f2", "ovs_interfaceid": "210b6f54-f21f-4204-ae34-04cfecde6270", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 859.540528] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Releasing lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 859.540528] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Instance network_info: |[{"id": "210b6f54-f21f-4204-ae34-04cfecde6270", "address": "fa:16:3e:56:2e:75", "network": {"id": "73a5d540-a79e-45bd-96f7-be2e6dd3fd08", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1062980772-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "723d64146c864441b408b12188438004", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap210b6f54-f2", "ovs_interfaceid": "210b6f54-f21f-4204-ae34-04cfecde6270", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 859.540957] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:56:2e:75', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '113aa98d-90ca-43bc-a534-8908d1ec7d15', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '210b6f54-f21f-4204-ae34-04cfecde6270', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 859.550778] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Creating folder: Project (723d64146c864441b408b12188438004). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 859.551497] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41f1b0ba-c63c-4df9-b058-dc235090e19b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.565599] env[66583]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 859.565827] env[66583]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=66583) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 859.566161] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Folder already exists: Project (723d64146c864441b408b12188438004). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 859.566366] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Creating folder: Instances. Parent ref: group-v693513. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 859.566839] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e0b3b64-a53d-43e3-a633-76ed872212ea {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.578205] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Created folder: Instances in parent group-v693513. [ 859.578447] env[66583]: DEBUG oslo.service.loopingcall [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 859.578631] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 859.578830] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a7c09776-7bf4-4d6e-b74e-e4a13cd84097 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 859.598091] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 859.598091] env[66583]: value = "task-3470308" [ 859.598091] env[66583]: _type = "Task" [ 859.598091] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 859.607385] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470308, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 860.108082] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470308, 'name': CreateVM_Task, 'duration_secs': 0.285709} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 860.108354] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 860.108905] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/sda', 'attachment_id': '3c1320cb-2c68-4cd3-af6d-4f5b47a51c1a', 'delete_on_termination': True, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693516', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'name': 'volume-d7fea663-747f-4be9-83d3-8b31cc2fc950', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '63244459-f37b-4fdb-8afc-9e4a80156099', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'serial': 'd7fea663-747f-4be9-83d3-8b31cc2fc950'}, 'device_type': None, 'boot_index': 0, 'guest_format': None, 'disk_bus': None, 'volume_type': None}], 'swap': None} {{(pid=66583) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 860.113021] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Root volume attach. 
Driver type: vmdk {{(pid=66583) attach_root_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:661}} [ 860.113021] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7c57e3e-a310-4292-8b71-1535a3818ecb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.118102] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e57e94c-9a66-431b-8cb0-a9497854f970 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.124721] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-03904792-a779-476d-ab9e-cc33fcbd80ff {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.131218] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-8a8a0367-41a4-45b1-abb7-1a46cd7cd636 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 860.138513] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 860.138513] env[66583]: value = "task-3470309" [ 860.138513] env[66583]: _type = "Task" [ 860.138513] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 860.146407] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 860.652923] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 42%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 860.926027] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 861.152687] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 54%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 861.222668] env[66583]: DEBUG nova.compute.manager [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Received event network-changed-210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 861.222911] env[66583]: DEBUG nova.compute.manager [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Refreshing instance network info cache due to event network-changed-210b6f54-f21f-4204-ae34-04cfecde6270. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 861.223370] env[66583]: DEBUG oslo_concurrency.lockutils [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] Acquiring lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 861.223582] env[66583]: DEBUG oslo_concurrency.lockutils [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] Acquired lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 861.223815] env[66583]: DEBUG nova.network.neutron [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Refreshing network info cache for port 210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 861.653950] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 69%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 861.757655] env[66583]: DEBUG nova.network.neutron [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updated VIF entry in instance network info cache for port 210b6f54-f21f-4204-ae34-04cfecde6270. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 861.758054] env[66583]: DEBUG nova.network.neutron [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updating instance_info_cache with network_info: [{"id": "210b6f54-f21f-4204-ae34-04cfecde6270", "address": "fa:16:3e:56:2e:75", "network": {"id": "73a5d540-a79e-45bd-96f7-be2e6dd3fd08", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1062980772-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "723d64146c864441b408b12188438004", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap210b6f54-f2", "ovs_interfaceid": "210b6f54-f21f-4204-ae34-04cfecde6270", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 861.774891] env[66583]: DEBUG oslo_concurrency.lockutils [req-813c4ecb-b901-462d-a736-9fc20fcce365 req-d44524a2-4956-467a-868b-ed61e63241d5 service nova] Releasing lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 861.846707] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 862.151600] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 84%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 862.653170] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 97%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.154911] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task} progress is 98%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.660139] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470309, 'name': RelocateVM_Task, 'duration_secs': 3.314164} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 863.660139] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Volume attach. Driver type: vmdk {{(pid=66583) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 863.660139] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693516', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'name': 'volume-d7fea663-747f-4be9-83d3-8b31cc2fc950', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '63244459-f37b-4fdb-8afc-9e4a80156099', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'serial': 'd7fea663-747f-4be9-83d3-8b31cc2fc950'} {{(pid=66583) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 863.660484] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ff909ad-aab9-40ae-aff7-caaa33a9d037 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.688018] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05a3de98-5eed-471a-8d92-b2bd2f8f8c90 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.711863] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Reconfiguring VM instance instance-00000011 to attach disk [datastore2] volume-d7fea663-747f-4be9-83d3-8b31cc2fc950/volume-d7fea663-747f-4be9-83d3-8b31cc2fc950.vmdk or device None with type thin {{(pid=66583) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 863.712231] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-6cadb900-dfad-4233-83e2-6e4df4f3b3ee {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 863.734318] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 863.734318] env[66583]: value = "task-3470310" [ 863.734318] env[66583]: _type = "Task" [ 863.734318] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 863.744696] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470310, 'name': ReconfigVM_Task} progress is 6%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 863.832097] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "504d18e4-8457-431b-b6cb-b26a0c64b14b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 863.832475] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 863.846714] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 863.846927] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.245198] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470310, 'name': ReconfigVM_Task, 'duration_secs': 0.265896} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 864.245477] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Reconfigured VM instance instance-00000011 to attach disk [datastore2] volume-d7fea663-747f-4be9-83d3-8b31cc2fc950/volume-d7fea663-747f-4be9-83d3-8b31cc2fc950.vmdk or device None with type thin {{(pid=66583) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 864.250971] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-254f57c6-66dd-47f0-a3e1-ce7b45ca822a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.264822] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 864.264822] env[66583]: value = "task-3470311" [ 864.264822] env[66583]: _type = "Task" [ 864.264822] env[66583]: } to complete.
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 864.272517] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470311, 'name': ReconfigVM_Task} progress is 5%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 864.775386] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470311, 'name': ReconfigVM_Task, 'duration_secs': 0.124252} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 864.775683] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693516', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'name': 'volume-d7fea663-747f-4be9-83d3-8b31cc2fc950', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '63244459-f37b-4fdb-8afc-9e4a80156099', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'serial': 'd7fea663-747f-4be9-83d3-8b31cc2fc950'} {{(pid=66583) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 864.776337] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-3d3708a8-6010-4090-95b8-332beacc1c5e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.782682] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 864.782682] env[66583]: value = "task-3470312" [ 864.782682] env[66583]: _type = "Task" [ 864.782682] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 864.790678] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470312, 'name': Rename_Task} progress is 5%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 864.846641] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.846641] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 864.846641] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 864.876948] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877129] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877266] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877397] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877533] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877666] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877842] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.877996] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Skipping network cache update for instance because it is Building. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.878167] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 864.878323] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 864.878857] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.879082] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.879248] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 864.879433] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 864.890011] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 864.890238] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 864.890431] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 864.890640] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 864.891729] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38c5207a-bf5d-4325-9a2e-242304fe9492 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.900354] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc0d9b1c-5367-4eaa-8047-daeedb1ad5bf {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.914932] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-493b7c0b-996e-4d84-8484-f2a85f003adb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.921527] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6493f7a6-cd34-428c-bdc9-3c30fef9502a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 864.953483] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180927MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 864.953483] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 864.953723] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.070131] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 3816b87a-030d-4362-9596-bd0899455e52 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070299] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070456] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070595] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070718] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070836] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.070955] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 9915557d-4251-44a2-bf59-3dd542dfb527 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.071232] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.071443] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 63244459-f37b-4fdb-8afc-9e4a80156099 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 865.083717] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a14582eb-f78f-44d6-8c82-16976c0cec5b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.094219] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 408735e7-0c1b-406e-b72d-8a0396830264 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.104029] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 87acbe03-624d-454c-b108-0566ca0d750e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.113593] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance a5fa8d3d-ad60-4749-bba1-0e00538a543f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.123025] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 0a575fbd-2390-401a-8df0-47a40e187c87 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.132873] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 1a9f02ca-7220-490c-81ed-bf2422173315 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.142662] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 2f03a941-3722-4df8-af76-3bd073f8927b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.152696] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance d04a1c66-b45e-4266-9e98-2682f7fa42d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.161962] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance b3cb9c35-714c-4ce5-b826-0c8398ed93b9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.172656] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e7664037-62b0-4195-b935-eab75d232f5d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.182517] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 504d18e4-8457-431b-b6cb-b26a0c64b14b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 865.182747] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 865.182889] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 865.198960] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing inventories for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 865.213029] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Updating ProviderTree inventory for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 865.213231] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Updating inventory in ProviderTree for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 865.223751] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing aggregate associations for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc, aggregates: None {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 865.239311] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing trait associations for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 865.294395] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470312, 'name': Rename_Task, 'duration_secs': 0.162624} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 865.294682] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Powering on the VM {{(pid=66583) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 865.294922] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-9138dde2-0b57-472d-a584-04f515ecbc98 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.300896] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 865.300896] env[66583]: value = "task-3470313" [ 865.300896] env[66583]: _type = "Task" [ 865.300896] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 865.308356] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470313, 'name': PowerOnVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 865.483542] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9e8ce70-af4b-468d-ad4e-bd99b4838690 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.491813] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8a78b5e-a43b-45dc-a78c-1073fb2131e6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.522229] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9395476e-e462-4297-a929-55751d0ac735 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.529919] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8632cbb-e370-4ea4-8aa0-b7b42cbefb37 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.543538] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 865.553023] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 865.567599] 
env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 865.567599] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 865.811906] env[66583]: DEBUG oslo_vmware.api [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470313, 'name': PowerOnVM_Task, 'duration_secs': 0.440392} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 865.812292] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Powered on the VM {{(pid=66583) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 865.812507] env[66583]: INFO nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Took 8.31 seconds to spawn the instance on the hypervisor. [ 865.812756] env[66583]: DEBUG nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Checking state {{(pid=66583) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 865.813536] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c94320b-d383-4d10-b2b1-a2e90089dd40 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 865.874845] env[66583]: INFO nova.compute.manager [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Took 9.19 seconds to build instance. [ 865.885937] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b644c54f-b0e2-4ea2-a03b-f9706fc09eb0 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "63244459-f37b-4fdb-8afc-9e4a80156099" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 165.306s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 865.895862] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Starting instance...
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 865.939563] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 865.939817] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 865.941247] env[66583]: INFO nova.compute.claims [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 866.198228] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ba7c67c-0723-4306-991d-deaa01b15557 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.205912] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ba4b2a-a64a-46f3-8f27-c958d5fd1cf0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.236132] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d066f2d6-8608-41fc-983a-a5e96eaf3d64 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.246013] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c6ea47d-8545-447f-9ba5-5dbd0cbf5650 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.259012] env[66583]: DEBUG nova.compute.provider_tree [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 866.267787] env[66583]: DEBUG nova.scheduler.client.report [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 866.283644] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock 
"compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.344s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 866.284171] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 866.317065] env[66583]: DEBUG nova.compute.utils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 866.318467] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 866.318635] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 866.328714] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 866.391897] env[66583]: DEBUG nova.policy [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b72fc5beb6c47c6acb056f1f4ed9eec', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd95fe3734183455b8eadbae4ca72e2d9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 866.396026] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 866.422578] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 866.422832] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 866.422991] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 866.423195] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 866.423355] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 866.423488] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 866.423698] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 866.423854] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 866.424031] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 
tempest-ServersTestJSON-2037535159-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 866.424246] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 866.424449] env[66583]: DEBUG nova.virt.hardware [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 866.425355] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-042c39b4-f36b-4de8-9978-90e7a769e375 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.433713] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9dff24c-eb58-4106-b72b-b3b084e25106 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 866.563954] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 866.926704] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Successfully created port: 2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 867.135359] env[66583]: DEBUG nova.compute.manager [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Received event network-changed-210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 867.135553] env[66583]: DEBUG nova.compute.manager [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Refreshing instance network info cache due to event network-changed-210b6f54-f21f-4204-ae34-04cfecde6270. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 867.135774] env[66583]: DEBUG oslo_concurrency.lockutils [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] Acquiring lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 867.135870] env[66583]: DEBUG oslo_concurrency.lockutils [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] Acquired lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 867.136086] env[66583]: DEBUG nova.network.neutron [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Refreshing network info cache for port 210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 868.034566] env[66583]: DEBUG nova.network.neutron [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updated VIF entry in instance network info cache for port 210b6f54-f21f-4204-ae34-04cfecde6270. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 868.034972] env[66583]: DEBUG nova.network.neutron [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Updating instance_info_cache with network_info: [{"id": "210b6f54-f21f-4204-ae34-04cfecde6270", "address": "fa:16:3e:56:2e:75", "network": {"id": "73a5d540-a79e-45bd-96f7-be2e6dd3fd08", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1062980772-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.163", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "723d64146c864441b408b12188438004", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "113aa98d-90ca-43bc-a534-8908d1ec7d15", "external-id": "nsx-vlan-transportzone-186", "segmentation_id": 186, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap210b6f54-f2", "ovs_interfaceid": "210b6f54-f21f-4204-ae34-04cfecde6270", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 868.050386] env[66583]: DEBUG oslo_concurrency.lockutils [req-8ed1ba6e-0c57-4576-8b44-e41ec582d005 req-ecb17e19-defe-44f4-911d-dc91c58faa2e service nova] Releasing lock "refresh_cache-63244459-f37b-4fdb-8afc-9e4a80156099" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 868.484922] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 
tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Successfully updated port: 2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 868.494033] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 868.494295] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquired lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 868.494530] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 868.570292] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 868.640709] env[66583]: DEBUG nova.compute.manager [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Received event network-vif-plugged-2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 868.640942] env[66583]: DEBUG oslo_concurrency.lockutils [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] Acquiring lock "a14582eb-f78f-44d6-8c82-16976c0cec5b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 868.641166] env[66583]: DEBUG oslo_concurrency.lockutils [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] Lock "a14582eb-f78f-44d6-8c82-16976c0cec5b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 868.641334] env[66583]: DEBUG oslo_concurrency.lockutils [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] Lock "a14582eb-f78f-44d6-8c82-16976c0cec5b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 868.641504] env[66583]: DEBUG nova.compute.manager [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] No waiting events found dispatching network-vif-plugged-2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) pop_instance_event
/opt/stack/nova/nova/compute/manager.py:320}} [ 868.642294] env[66583]: WARNING nova.compute.manager [req-3db13902-6d54-4367-ad73-6061e2b47731 req-dde7396d-c453-4351-acf2-748fe72034ee service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Received unexpected event network-vif-plugged-2708f533-5245-4118-bacf-0b47cd5d8cc3 for instance with vm_state building and task_state spawning. [ 868.902589] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Updating instance_info_cache with network_info: [{"id": "2708f533-5245-4118-bacf-0b47cd5d8cc3", "address": "fa:16:3e:1d:72:c6", "network": {"id": "154fbd61-777b-463d-ac97-f618e8c6ea38", "bridge": "br-int", "label": "tempest-ServersTestJSON-691636788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d95fe3734183455b8eadbae4ca72e2d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0e00b2f1-c70f-4b21-86eb-810643cc1680", "external-id": "nsx-vlan-transportzone-487", "segmentation_id": 487, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2708f533-52", "ovs_interfaceid": "2708f533-5245-4118-bacf-0b47cd5d8cc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 868.924892] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Releasing lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 868.925236] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance network_info: |[{"id": "2708f533-5245-4118-bacf-0b47cd5d8cc3", "address": "fa:16:3e:1d:72:c6", "network": {"id": "154fbd61-777b-463d-ac97-f618e8c6ea38", "bridge": "br-int", "label": "tempest-ServersTestJSON-691636788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d95fe3734183455b8eadbae4ca72e2d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0e00b2f1-c70f-4b21-86eb-810643cc1680", "external-id": "nsx-vlan-transportzone-487", "segmentation_id": 487, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2708f533-52", "ovs_interfaceid": 
"2708f533-5245-4118-bacf-0b47cd5d8cc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 868.925664] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:72:c6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0e00b2f1-c70f-4b21-86eb-810643cc1680', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2708f533-5245-4118-bacf-0b47cd5d8cc3', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 868.933249] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Creating folder: Project (d95fe3734183455b8eadbae4ca72e2d9). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 868.933833] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-294c6db3-4ce8-4a08-8adb-7e55d1385dde {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.945256] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Created folder: Project (d95fe3734183455b8eadbae4ca72e2d9) in parent group-v693485. [ 868.945480] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Creating folder: Instances. Parent ref: group-v693544. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 868.945721] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5a53ae2b-dd26-46b4-bc7b-afaeb367d7da {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.954766] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Created folder: Instances in parent group-v693544. [ 868.954995] env[66583]: DEBUG oslo.service.loopingcall [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 868.955272] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 868.955488] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-eb58f5d1-c1bd-4d8b-9015-e3f2b35b7b5a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 868.975097] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 868.975097] env[66583]: value = "task-3470316" [ 868.975097] env[66583]: _type = "Task" [ 868.975097] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 868.982930] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470316, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 869.485278] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470316, 'name': CreateVM_Task, 'duration_secs': 0.36099} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 869.485552] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 869.486143] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 869.486314] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 869.486622] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 869.486864] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1450126d-f632-4fb6-85f6-21fa425f0aeb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.491331] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Waiting for the task: (returnval){ [ 869.491331] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52d5774f-3ceb-753c-39ec-c9a7e7b1f98c" [ 869.491331] env[66583]: _type = "Task" [ 869.491331] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 869.500294] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52d5774f-3ceb-753c-39ec-c9a7e7b1f98c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 870.001079] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 870.001335] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 870.001544] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.768884] env[66583]: DEBUG nova.compute.manager [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Received event network-changed-2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 870.769157] env[66583]: DEBUG nova.compute.manager [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Refreshing instance network info cache due to event network-changed-2708f533-5245-4118-bacf-0b47cd5d8cc3. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 870.769320] env[66583]: DEBUG oslo_concurrency.lockutils [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] Acquiring lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 870.769481] env[66583]: DEBUG oslo_concurrency.lockutils [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] Acquired lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 870.769700] env[66583]: DEBUG nova.network.neutron [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Refreshing network info cache for port 2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 871.127198] env[66583]: DEBUG nova.network.neutron [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Updated VIF entry in instance network info cache for port 2708f533-5245-4118-bacf-0b47cd5d8cc3. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 871.127557] env[66583]: DEBUG nova.network.neutron [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Updating instance_info_cache with network_info: [{"id": "2708f533-5245-4118-bacf-0b47cd5d8cc3", "address": "fa:16:3e:1d:72:c6", "network": {"id": "154fbd61-777b-463d-ac97-f618e8c6ea38", "bridge": "br-int", "label": "tempest-ServersTestJSON-691636788-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d95fe3734183455b8eadbae4ca72e2d9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0e00b2f1-c70f-4b21-86eb-810643cc1680", "external-id": "nsx-vlan-transportzone-487", "segmentation_id": 487, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2708f533-52", "ovs_interfaceid": "2708f533-5245-4118-bacf-0b47cd5d8cc3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 871.136593] env[66583]: DEBUG oslo_concurrency.lockutils [req-18499eec-86b5-46b0-934a-a5835fa300aa req-d94b4b78-5e0e-4867-9d6d-a27554f6864d service nova] Releasing lock "refresh_cache-a14582eb-f78f-44d6-8c82-16976c0cec5b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 885.528051] env[66583]: INFO nova.compute.manager [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] 
Rebuilding instance [ 885.560389] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lazy-loading 'trusted_certs' on Instance uuid 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 885.572458] env[66583]: DEBUG nova.compute.manager [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Checking state {{(pid=66583) _get_power_state /opt/stack/nova/nova/compute/manager.py:1762}} [ 885.573309] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23668f95-b09b-44cb-a184-aa857ad6b59c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.626059] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lazy-loading 'pci_requests' on Instance uuid 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 885.636249] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lazy-loading 'pci_devices' on Instance uuid 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 885.665667] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lazy-loading 'resources' on Instance uuid 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 885.674770] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lazy-loading 'migration_context' on Instance uuid 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1105}} [ 885.681792] env[66583]: DEBUG nova.objects.instance [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Trying to apply a migration context that does not seem to be set for this instance {{(pid=66583) apply_migration_context /opt/stack/nova/nova/objects/instance.py:1032}} [ 885.682261] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Powering off the VM {{(pid=66583) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 885.682573] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-9edc8bb3-fdea-4414-a163-9a93308fdce6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 885.690415] env[66583]: DEBUG oslo_vmware.api 
[None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 885.690415] env[66583]: value = "task-3470317" [ 885.690415] env[66583]: _type = "Task" [ 885.690415] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 885.698626] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470317, 'name': PowerOffVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 886.200555] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470317, 'name': PowerOffVM_Task, 'duration_secs': 0.166924} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 886.200776] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Powered off the VM {{(pid=66583) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 886.201493] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Powering off the VM {{(pid=66583) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 886.201995] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-77a341bf-d104-41de-b22b-65b37d977ccc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.207450] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 886.207450] env[66583]: value = "task-3470318" [ 886.207450] env[66583]: _type = "Task" [ 886.207450] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 886.214456] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470318, 'name': PowerOffVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 886.718447] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] VM already powered off {{(pid=66583) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 886.718720] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Volume detach. Driver type: vmdk {{(pid=66583) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 886.718891] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693516', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'name': 'volume-d7fea663-747f-4be9-83d3-8b31cc2fc950', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '63244459-f37b-4fdb-8afc-9e4a80156099', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'serial': 'd7fea663-747f-4be9-83d3-8b31cc2fc950'} {{(pid=66583) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 886.719658] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c89104a-f3bd-4896-b633-407e001fb6ad {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.738532] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ddf2af91-a95c-4fa8-98af-e3df2d4a5faa {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.745167] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60fdfe75-4919-48ae-bc83-3f523bdda6b3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.763042] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8709fe5-4101-465e-bffa-adf57294aba5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.778674] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] The volume has not been displaced from its original location: [datastore2] volume-d7fea663-747f-4be9-83d3-8b31cc2fc950/volume-d7fea663-747f-4be9-83d3-8b31cc2fc950.vmdk. No consolidation needed. 
{{(pid=66583) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 886.783812] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Reconfiguring VM instance instance-00000011 to detach disk 2000 {{(pid=66583) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 886.784119] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-4b9992a9-df20-487c-b767-90f7f559155f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.802790] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 886.802790] env[66583]: value = "task-3470319" [ 886.802790] env[66583]: _type = "Task" [ 886.802790] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 886.810818] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470319, 'name': ReconfigVM_Task} progress is 5%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 887.312436] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470319, 'name': ReconfigVM_Task, 'duration_secs': 0.179519} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 887.312735] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Reconfigured VM instance instance-00000011 to detach disk 2000 {{(pid=66583) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 887.317233] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-15586173-4f3a-49c6-9b5c-651af0520bc6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.331627] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 887.331627] env[66583]: value = "task-3470320" [ 887.331627] env[66583]: _type = "Task" [ 887.331627] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 887.339116] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470320, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 887.841526] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470320, 'name': ReconfigVM_Task, 'duration_secs': 0.0974} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 887.841828] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-693516', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'name': 'volume-d7fea663-747f-4be9-83d3-8b31cc2fc950', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '63244459-f37b-4fdb-8afc-9e4a80156099', 'attached_at': '', 'detached_at': '', 'volume_id': 'd7fea663-747f-4be9-83d3-8b31cc2fc950', 'serial': 'd7fea663-747f-4be9-83d3-8b31cc2fc950'} {{(pid=66583) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 887.846188] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 887.846999] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e50065f1-34b6-4844-8b4a-cefef6607922 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.853468] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 887.853688] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7918dd8d-9fce-40a6-b3c4-91ab2e0e7cf8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.911795] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 887.912034] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 887.912226] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Deleting 
the datastore file [datastore2] 63244459-f37b-4fdb-8afc-9e4a80156099 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 887.912511] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9f9a03dd-cdf5-48da-a3d3-a51ee87cc28b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 887.919200] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Waiting for the task: (returnval){ [ 887.919200] env[66583]: value = "task-3470322" [ 887.919200] env[66583]: _type = "Task" [ 887.919200] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 888.430796] env[66583]: DEBUG oslo_vmware.api [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Task: {'id': task-3470322, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079461} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 888.431073] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 888.431273] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 888.431455] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 888.484027] env[66583]: DEBUG nova.virt.vmwareapi.volumeops [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Volume detach. 
Driver type: vmdk {{(pid=66583) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 888.484539] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-211387e9-fb28-4a6c-b6a5-43ca3b4cc346 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.492380] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bb1d014-7a4d-4e78-a292-d9017b893901 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.524326] env[66583]: ERROR nova.compute.manager [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Failed to detach volume d7fea663-747f-4be9-83d3-8b31cc2fc950 from /dev/sda: nova.exception.InstanceNotFound: Instance 63244459-f37b-4fdb-8afc-9e4a80156099 could not be found. [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Traceback (most recent call last): [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self.driver.rebuild(**kwargs) [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise NotImplementedError() [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] NotImplementedError [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] During handling of the above exception, another exception occurred: [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Traceback (most recent call last): [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 888.524326] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self.driver.detach_volume(context, old_connection_info, [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] return self._volumeops.detach_volume(connection_info, instance) [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._detach_volume_vmdk(connection_info, instance) [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 
63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] stable_ref.fetch_moref(session) [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise exception.InstanceNotFound(instance_id=self._uuid) [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] nova.exception.InstanceNotFound: Instance 63244459-f37b-4fdb-8afc-9e4a80156099 could not be found. [ 888.524797] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.653647] env[66583]: DEBUG nova.compute.utils [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Build of instance 63244459-f37b-4fdb-8afc-9e4a80156099 aborted: Failed to rebuild volume backed instance. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 888.656198] env[66583]: ERROR nova.compute.manager [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance 63244459-f37b-4fdb-8afc-9e4a80156099 aborted: Failed to rebuild volume backed instance. 
[ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Traceback (most recent call last): [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 4100, in _do_rebuild_instance [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self.driver.rebuild(**kwargs) [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/driver.py", line 378, in rebuild [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise NotImplementedError() [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] NotImplementedError [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] During handling of the above exception, another exception occurred: [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Traceback (most recent call last): [ 888.656198] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3570, in _rebuild_volume_backed_instance [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._detach_root_volume(context, instance, root_bdm) [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3549, in _detach_root_volume [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] with excutils.save_and_reraise_exception(): [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self.force_reraise() [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise self.value [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3535, in _detach_root_volume [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self.driver.detach_volume(context, old_connection_info, [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 542, in detach_volume [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] return self._volumeops.detach_volume(connection_info, instance) [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", 
line 649, in detach_volume [ 888.656644] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._detach_volume_vmdk(connection_info, instance) [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] stable_ref.fetch_moref(session) [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise exception.InstanceNotFound(instance_id=self._uuid) [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] nova.exception.InstanceNotFound: Instance 63244459-f37b-4fdb-8afc-9e4a80156099 could not be found. [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] During handling of the above exception, another exception occurred: [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Traceback (most recent call last): [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 10738, in _error_out_instance_on_exception [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] yield [ 888.657201] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3826, in rebuild_instance [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._do_rebuild_instance_with_claim( [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3912, in _do_rebuild_instance_with_claim [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._do_rebuild_instance( [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 4104, in _do_rebuild_instance [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._rebuild_default_impl(**kwargs) [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3693, in _rebuild_default_impl [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] self._rebuild_volume_backed_instance( [ 888.657606] env[66583]: ERROR nova.compute.manager 
[instance: 63244459-f37b-4fdb-8afc-9e4a80156099] File "/opt/stack/nova/nova/compute/manager.py", line 3585, in _rebuild_volume_backed_instance [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] raise exception.BuildAbortException( [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] nova.exception.BuildAbortException: Build of instance 63244459-f37b-4fdb-8afc-9e4a80156099 aborted: Failed to rebuild volume backed instance. [ 888.657606] env[66583]: ERROR nova.compute.manager [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] [ 888.740337] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 888.740657] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 888.961947] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e353575b-16e2-41a6-9f3d-53343ebe5958 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.969298] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-86f54e75-06bb-4314-9bff-2619b6eb2de0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 888.999034] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bbf0d647-2c89-43c7-83dc-3f1ef39bb1ef {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 889.006280] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c69312b2-bbd2-4e30-b586-978719007599 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 889.020307] env[66583]: DEBUG nova.compute.provider_tree [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 889.028957] env[66583]: DEBUG nova.scheduler.client.report [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 
1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 889.042633] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.302s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 889.042884] env[66583]: INFO nova.compute.manager [None req-5492f36e-3921-4ce7-a932-dece873f1406 tempest-ServerActionsV293TestJSON-1600855171 tempest-ServerActionsV293TestJSON-1600855171-project-member] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Successfully reverted task state from rebuilding on failure for instance. [ 891.062215] env[66583]: DEBUG nova.compute.manager [req-681f71f0-3f09-403a-927f-10c24b20a2eb req-c52d77a1-4dae-4dfd-8cf7-01026138a7b8 service nova] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Received event network-vif-deleted-210b6f54-f21f-4204-ae34-04cfecde6270 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 894.478321] env[66583]: DEBUG nova.compute.manager [req-ae3c656b-a679-4964-90e3-f73948a6b9d7 req-7dabe1e7-0fa8-45ad-a8c2-4ba3eba163c2 service nova] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Received event network-vif-deleted-19aeb4fb-ff1b-49f7-978f-0b3c70290a9c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 894.478321] env[66583]: DEBUG nova.compute.manager [req-ae3c656b-a679-4964-90e3-f73948a6b9d7 req-7dabe1e7-0fa8-45ad-a8c2-4ba3eba163c2 service nova] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Received event network-vif-deleted-4a5ea210-e518-4297-bb4d-a3844f9a1b1e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 897.112848] env[66583]: DEBUG nova.compute.manager [req-b988c8ac-1433-4f03-bfbe-776a48ea37f0 req-97524130-10bb-4dce-9c28-0fe1c3f4a378 service nova] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Received event network-vif-deleted-442f498f-b231-40f3-9a0f-93dec6c5b76c {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 899.543607] env[66583]: DEBUG nova.compute.manager [req-b8faa7c7-b53d-4dd0-abb0-f568ba545480 req-5c991322-0665-459c-9888-c41f0ef7031b service nova] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Received event network-vif-deleted-d43f0eaf-b8ab-49e8-81fd-d9317a958b0b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 899.544027] env[66583]: DEBUG nova.compute.manager [req-b8faa7c7-b53d-4dd0-abb0-f568ba545480 req-5c991322-0665-459c-9888-c41f0ef7031b service nova] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Received event network-vif-deleted-43d71ce7-f3f2-477a-908d-0f01e957e661 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 899.544219] env[66583]: DEBUG nova.compute.manager [req-b8faa7c7-b53d-4dd0-abb0-f568ba545480 req-5c991322-0665-459c-9888-c41f0ef7031b service nova] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Received event network-vif-deleted-2708f533-5245-4118-bacf-0b47cd5d8cc3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 905.157822] env[66583]: WARNING oslo_vmware.rw_handles [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 
tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 905.157822] env[66583]: ERROR oslo_vmware.rw_handles [ 905.158470] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 905.160247] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 905.161476] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Copying Virtual Disk [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/f29d2a88-48cd-47a4-8856-6507cb5f5c46/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 905.161476] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c9b66555-f774-4b12-90b0-e55632c8a883 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.170971] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for the task: (returnval){ [ 905.170971] env[66583]: value = "task-3470324" [ 905.170971] env[66583]: _type = "Task" [ 905.170971] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 905.181106] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Task: {'id': task-3470324, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 905.689167] env[66583]: DEBUG oslo_vmware.exceptions [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 905.689559] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 905.690241] env[66583]: ERROR nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.690241] env[66583]: Faults: ['InvalidArgument'] [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] Traceback (most recent call last): [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] yield resources [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self.driver.spawn(context, instance, image_meta, [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self._vmops.spawn(context, instance, image_meta, injected_files, [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self._fetch_image_if_missing(context, vi) [ 905.690241] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] image_cache(vi, tmp_image_ds_loc) [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 
3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] vm_util.copy_virtual_disk( [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] session._wait_for_task(vmdk_copy_task) [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return self.wait_for_task(task_ref) [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return evt.wait() [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] result = hub.switch() [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 905.690726] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return self.greenlet.switch() [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self.f(*self.args, **self.kw) [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] raise exceptions.translate_fault(task_info.error) [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] Faults: ['InvalidArgument'] [ 905.691157] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] [ 905.691157] env[66583]: INFO nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Terminating instance [ 905.692880] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "[datastore2] 
devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 905.692880] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 905.695915] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 905.696143] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 905.696399] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-35853910-b264-4686-b1c7-c3e5d5335a4a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.701404] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c051ace-d5f1-4726-b627-bcc8597f9dc8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.709232] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 905.709480] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5c530e20-df82-4650-8ad5-fadbbcb5e990 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.712221] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 905.712398] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 905.713614] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-89575242-71d7-460e-b0cd-d22dfd383200 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.718618] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 905.718618] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52c02e85-688c-6472-0300-d0b57aad3785" [ 905.718618] env[66583]: _type = "Task" [ 905.718618] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 905.726318] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52c02e85-688c-6472-0300-d0b57aad3785, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 905.775820] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 905.776084] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 905.776447] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Deleting the datastore file [datastore2] 3816b87a-030d-4362-9596-bd0899455e52 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 905.776540] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f14ca1e5-d9bb-4686-b30a-e02da6747845 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 905.786659] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for the task: (returnval){ [ 905.786659] env[66583]: value = "task-3470326" [ 905.786659] env[66583]: _type = "Task" [ 905.786659] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 905.795940] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Task: {'id': task-3470326, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 906.229846] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 906.230457] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating directory with path [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 906.231159] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc4cd5ac-3195-418a-ac1a-7076d4d8a235 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.244755] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created directory with path [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 906.244964] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Fetch image to [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 906.245148] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 906.246064] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-710ce289-9b57-45fc-ae05-1f83fb851e73 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.255721] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11e114ba-67c9-47d2-8992-5dad2882e647 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.264919] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e508713-9203-4db8-b996-8595274b8d82 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.303055] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6972b18-8a9f-4996-ac69-6e0c7d5baa49 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.311046] env[66583]: DEBUG oslo_vmware.api [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Task: {'id': task-3470326, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085753} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 906.314218] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 906.314218] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 906.314218] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 906.314218] env[66583]: INFO nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Took 0.62 seconds to destroy the instance on the hypervisor. 
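Annotation: the sequence above (CopyVirtualDisk_Task failing with InvalidArgument, then DeleteDatastoreFile_Task completing with 'duration_secs': 0.085753) is driven by oslo.vmware's task-polling loop visible in the tracebacks: a *_Task call returns a task reference, and wait_for_task/_poll_task repeatedly fetch its state until success or error, translating VMware faults into Python exceptions ("Fault InvalidArgument not matched" means the fault name fell through to the generic VimFaultException). A minimal sketch of that pattern, using hypothetical names (get_task_info, VimFault) rather than the real oslo.vmware API:

    import time

    class VimFault(Exception):
        """Stand-in for the translated VMware fault (e.g. InvalidArgument)."""

    def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
        """Poll a vCenter task reference until it finishes or fails."""
        while True:
            info = get_task_info(task_ref)   # one PropertyCollector round-trip
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # Mirrors `raise exceptions.translate_fault(task_info.error)`
                # in the traceback above: the fault name is matched against
                # known exception classes before a generic fault is raised.
                raise VimFault(info['error'])
            # 'queued'/'running': log "progress is N%" and retry, as the
            # repeated _poll_task entries above do.
            time.sleep(poll_interval)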
[ 906.315636] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d40e43f4-2f11-4e1a-806d-67c6a4ce0900 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.318302] env[66583]: DEBUG nova.compute.claims [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 906.318302] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 906.318302] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 906.342604] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 906.456912] env[66583]: DEBUG oslo_vmware.rw_handles [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 906.519654] env[66583]: DEBUG oslo_vmware.rw_handles [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 906.519654] env[66583]: DEBUG oslo_vmware.rw_handles [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 906.624440] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c3b8952-a818-4a2b-8ba1-94fd90ad2571 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.633108] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-578b6cb5-db08-4715-946b-4baa540e2d36 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.665467] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9800740b-f0f2-4b78-add7-e91f152508bd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.676059] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f25664b-6ba5-4e99-b2e1-b52049ef9721 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 906.690628] env[66583]: DEBUG nova.compute.provider_tree [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 906.701676] env[66583]: DEBUG nova.scheduler.client.report [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 906.723380] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.403s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 906.723380] env[66583]: ERROR nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 906.723380] env[66583]: Faults: ['InvalidArgument'] [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] Traceback (most recent call last): [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 906.723380] env[66583]: ERROR nova.compute.manager 
[instance: 3816b87a-030d-4362-9596-bd0899455e52] self.driver.spawn(context, instance, image_meta, [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self._vmops.spawn(context, instance, image_meta, injected_files, [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 906.723380] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self._fetch_image_if_missing(context, vi) [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] image_cache(vi, tmp_image_ds_loc) [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] vm_util.copy_virtual_disk( [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] session._wait_for_task(vmdk_copy_task) [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return self.wait_for_task(task_ref) [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return evt.wait() [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] result = hub.switch() [ 906.724202] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] return self.greenlet.switch() [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] self.f(*self.args, **self.kw) [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 906.724943] 
env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] raise exceptions.translate_fault(task_info.error) [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] Faults: ['InvalidArgument'] [ 906.724943] env[66583]: ERROR nova.compute.manager [instance: 3816b87a-030d-4362-9596-bd0899455e52] [ 906.724943] env[66583]: DEBUG nova.compute.utils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 906.724943] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Build of instance 3816b87a-030d-4362-9596-bd0899455e52 was re-scheduled: A specified parameter was not correct: fileType [ 906.725283] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 906.725283] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 906.725358] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 906.725489] env[66583]: DEBUG nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 906.725609] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 907.118845] env[66583]: DEBUG nova.network.neutron [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.131081] env[66583]: INFO nova.compute.manager [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Took 0.41 seconds to deallocate network for instance. [ 907.231996] env[66583]: INFO nova.scheduler.client.report [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Deleted allocations for instance 3816b87a-030d-4362-9596-bd0899455e52 [ 907.253666] env[66583]: DEBUG oslo_concurrency.lockutils [None req-687ee19b-c140-4f8c-afb4-f1ede5bd973b tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 291.160s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.254855] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 93.686s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.255091] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Acquiring lock "3816b87a-030d-4362-9596-bd0899455e52-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 907.256020] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 907.256020] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 907.259183] env[66583]: INFO nova.compute.manager [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Terminating instance [ 907.261302] env[66583]: DEBUG nova.compute.manager [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 907.261753] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 907.262400] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c5d54d7b-ba47-4173-b705-c1d380efa23f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.270181] env[66583]: DEBUG nova.compute.manager [None req-fc79f6a0-9420-4e07-ac7c-1bebac286d1d tempest-ServerActionsTestOtherB-1703846471 tempest-ServerActionsTestOtherB-1703846471-project-member] [instance: 408735e7-0c1b-406e-b72d-8a0396830264] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 907.277872] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcaefcec-36ee-4602-a1f6-dce1a3257196 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 907.309815] env[66583]: WARNING nova.virt.vmwareapi.vmops [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3816b87a-030d-4362-9596-bd0899455e52 could not be found. 
[ 907.310428] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 907.310428] env[66583]: INFO nova.compute.manager [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Took 0.05 seconds to destroy the instance on the hypervisor. [ 907.310777] env[66583]: DEBUG oslo.service.loopingcall [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 907.310860] env[66583]: DEBUG nova.compute.manager [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 907.310911] env[66583]: DEBUG nova.network.neutron [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 907.347453] env[66583]: DEBUG nova.network.neutron [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 907.355936] env[66583]: INFO nova.compute.manager [-] [instance: 3816b87a-030d-4362-9596-bd0899455e52] Took 0.04 seconds to deallocate network for instance. [ 907.453778] env[66583]: DEBUG oslo_concurrency.lockutils [None req-5f8df56b-e896-4eb7-afce-a307f5bfe64d tempest-ServerDiagnosticsNegativeTest-698352656 tempest-ServerDiagnosticsNegativeTest-698352656-project-member] Lock "3816b87a-030d-4362-9596-bd0899455e52" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.199s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.333686] env[66583]: DEBUG nova.compute.manager [None req-fc79f6a0-9420-4e07-ac7c-1bebac286d1d tempest-ServerActionsTestOtherB-1703846471 tempest-ServerActionsTestOtherB-1703846471-project-member] [instance: 408735e7-0c1b-406e-b72d-8a0396830264] Instance disappeared before build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 908.357830] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fc79f6a0-9420-4e07-ac7c-1bebac286d1d tempest-ServerActionsTestOtherB-1703846471 tempest-ServerActionsTestOtherB-1703846471-project-member] Lock "408735e7-0c1b-406e-b72d-8a0396830264" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.370s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.366951] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 908.418532] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.418832] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.420373] env[66583]: INFO nova.compute.claims [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 908.647894] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-207763a0-fcaf-4676-88fc-b8a712700753 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.660917] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e402ffcc-af5d-4f67-888c-1b1f5064ff0a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.697916] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-556c4a2f-8dfe-40cc-9177-ffe40b3ef9de {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.706992] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33d934c5-dd5d-457c-adc7-5a7413f6ef77 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.721648] env[66583]: DEBUG nova.compute.provider_tree [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 908.731102] env[66583]: DEBUG nova.scheduler.client.report [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 908.744454] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b 
tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 908.745441] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 908.790339] env[66583]: DEBUG nova.compute.utils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 908.791579] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 908.791753] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 908.806793] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 908.874852] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 908.899257] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 908.899522] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 908.899681] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 908.899870] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 908.900042] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 908.900195] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 908.900409] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 908.900569] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 908.900738] 
env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 908.900900] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 908.901085] env[66583]: DEBUG nova.virt.hardware [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 908.901933] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8717b4fa-ca70-45ad-8ea4-8788f2607b6e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.909779] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f34b1321-9c01-4238-9d64-ae51fd9870ea {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 908.914959] env[66583]: DEBUG nova.policy [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '1a2f40b0c6be4850b45d7af29b0ef446', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f54e44041809424ba5090e357365305c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 909.431913] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Successfully created port: a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 910.898436] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Successfully updated port: a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 910.909700] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 910.909850] env[66583]: DEBUG oslo_concurrency.lockutils [None 
req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 910.910000] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 910.989754] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 911.011981] env[66583]: DEBUG nova.compute.manager [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Received event network-vif-plugged-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 911.012219] env[66583]: DEBUG oslo_concurrency.lockutils [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] Acquiring lock "87acbe03-624d-454c-b108-0566ca0d750e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 911.012428] env[66583]: DEBUG oslo_concurrency.lockutils [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] Lock "87acbe03-624d-454c-b108-0566ca0d750e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 911.012592] env[66583]: DEBUG oslo_concurrency.lockutils [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] Lock "87acbe03-624d-454c-b108-0566ca0d750e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 911.012758] env[66583]: DEBUG nova.compute.manager [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] No waiting events found dispatching network-vif-plugged-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 911.013031] env[66583]: WARNING nova.compute.manager [req-b8c204e4-453b-4490-b43a-e425701aa642 req-1e2e6255-f272-4eed-9c73-f2d393614a15 service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Received unexpected event network-vif-plugged-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 for instance with vm_state building and task_state spawning. 
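Annotation: the "No waiting events found dispatching network-vif-plugged-..." / "Received unexpected event" pair above comes from Nova's external-event handshake: the spawn path registers the events it expects before plugging VIFs, and the Neutron-triggered callback pops a matching waiter under the per-instance "<uuid>-events" lock. Here the port went active before CreateVM_Task ran, so no waiter was registered yet and the event is merely logged. A simplified model of that registry (illustrative only, not the actual InstanceEvents class):

    import threading

    class InstanceEvents:
        def __init__(self):
            self._waiters = {}            # (instance_uuid, event_name) -> Event

        def prepare(self, uuid, name):
            """Called by the spawn thread before the action that triggers the event."""
            ev = threading.Event()
            self._waiters[(uuid, name)] = ev
            return ev

        def pop(self, uuid, name):
            # Mirrors pop_instance_event: remove the waiter, if any exists.
            return self._waiters.pop((uuid, name), None)

    def external_instance_event(events, uuid, name):
        """Entry point for events delivered by Neutron via the API."""
        waiter = events.pop(uuid, name)
        if waiter is None:
            # No one is waiting yet -> the WARNING seen in the log above.
            print(f"WARNING: received unexpected event {name} for {uuid}")
        else:
            waiter.set()                  # wakes the thread blocked on wait()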
[ 911.307362] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Updating instance_info_cache with network_info: [{"id": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "address": "fa:16:3e:ae:3b:98", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c7cc7e-9d", "ovs_interfaceid": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 911.325382] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 911.325704] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance network_info: |[{"id": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "address": "fa:16:3e:ae:3b:98", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c7cc7e-9d", "ovs_interfaceid": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 911.326088] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b 
tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ae:3b:98', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7894814c-6be3-4b80-a08e-4a771bc05dd1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 911.338117] env[66583]: DEBUG oslo.service.loopingcall [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 911.341431] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 911.341431] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6baccee0-7099-45b3-99cf-568f4463de4a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.366679] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 911.366679] env[66583]: value = "task-3470327" [ 911.366679] env[66583]: _type = "Task" [ 911.366679] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 911.376875] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470327, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 911.878042] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470327, 'name': CreateVM_Task, 'duration_secs': 0.306258} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 911.878349] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 911.879192] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 911.879572] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 911.880117] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 911.880591] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1440cc08-4381-416c-9e7d-1c2f215048a3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 911.885617] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 911.885617] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5221f652-86c8-be4f-15a4-aec1b1c4c4ae" [ 911.885617] env[66583]: _type = "Task" [ 911.885617] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 911.899219] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5221f652-86c8-be4f-15a4-aec1b1c4c4ae, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 912.398291] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 912.398616] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 912.398812] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 912.398952] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 912.399146] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 912.399406] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cb7132f4-86fa-479b-89f2-78533d85487f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 912.421517] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 912.421731] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 912.422512] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38a3f6f5-e3bb-49aa-854b-260fda466d38 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 912.428294] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 912.428294] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52968906-6c2d-4809-a950-cda615f5358f" [ 912.428294] env[66583]: _type = "Task" [ 912.428294] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 912.436913] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52968906-6c2d-4809-a950-cda615f5358f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 912.942170] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 912.942788] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating directory with path [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 912.942788] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d11b2ceb-b771-4a3f-a2f2-7aec2b959247 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 912.965268] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Created directory with path [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 912.965475] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Fetch image to [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 912.965671] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Downloading image file data 
2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 912.966541] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9ed5768-cb7d-43f5-8019-177fab079cc0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 912.973796] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15c72367-413e-4194-8d7e-b88169c62db3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 912.983406] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e2b51ef-9cde-4178-b57e-96e9a5cd6295 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.016271] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-655e717f-d730-47f6-84a1-ea0d700e7c8a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.022671] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61a75369-2e8e-46b5-a31c-18972391c6fb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 913.047037] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 913.103641] env[66583]: DEBUG oslo_vmware.rw_handles [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 913.162224] env[66583]: DEBUG oslo_vmware.rw_handles [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 913.162224] env[66583]: DEBUG oslo_vmware.rw_handles [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 913.320979] env[66583]: DEBUG nova.compute.manager [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Received event network-changed-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 913.321347] env[66583]: DEBUG nova.compute.manager [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Refreshing instance network info cache due to event network-changed-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 913.321755] env[66583]: DEBUG oslo_concurrency.lockutils [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] Acquiring lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 913.321999] env[66583]: DEBUG oslo_concurrency.lockutils [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] Acquired lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 913.322350] env[66583]: DEBUG nova.network.neutron [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Refreshing network info cache for port a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 914.084455] env[66583]: DEBUG nova.network.neutron [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Updated VIF entry in instance network info cache for port a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 914.084778] env[66583]: DEBUG nova.network.neutron [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Updating instance_info_cache with network_info: [{"id": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "address": "fa:16:3e:ae:3b:98", "network": {"id": "5a89c5cc-a8f0-4409-98e7-e995179a187b", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.32", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "41f302a7ddc84085a05c55c0788e6a8e", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7894814c-6be3-4b80-a08e-4a771bc05dd1", "external-id": "nsx-vlan-transportzone-948", "segmentation_id": 948, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa2c7cc7e-9d", "ovs_interfaceid": "a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 914.100168] env[66583]: DEBUG oslo_concurrency.lockutils [req-ce0708f4-8951-4f4c-809d-7148a3d324ad req-4d59570e-5806-475d-b15a-9461513631dd service nova] Releasing lock "refresh_cache-87acbe03-624d-454c-b108-0566ca0d750e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 916.040984] env[66583]: DEBUG nova.compute.manager [req-0e4dbcbf-2d3e-4116-8478-0a91bbefd52c req-6ba7fd30-9e22-4dd1-b0c5-182e3d3ebe81 service nova] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Received event network-vif-deleted-a2c7cc7e-9df9-4df6-b4e9-338a6afbc2a3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 920.846712] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 923.847625] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.842937] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.861791] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.862060] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 924.862093] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 924.878127] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 924.879655] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 924.879655] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 924.879655] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.879655] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 924.879655] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 925.846193] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 925.846567] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 925.846808] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 926.847739] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 926.864429] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 926.864684] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 926.864863] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 926.865029] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 926.869282] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56374470-2014-46cb-bd97-517fb30ff547 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.879992] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ede0f69-5d86-450b-ad96-b481d29cb1d0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.894876] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-634aa9db-5dd4-442b-9ab6-f7be5c13dc1c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.902482] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-093ea12b-3156-439f-97c4-d31fcbafe364 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 926.939750] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180740MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 926.939914] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 926.940135] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 926.998292] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 926.998447] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 927.011511] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e7664037-62b0-4195-b935-eab75d232f5d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 927.023386] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 504d18e4-8457-431b-b6cb-b26a0c64b14b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 927.023616] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 927.024195] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 927.114304] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd346823-aead-433e-bf14-067bd23774b8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.124140] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87101b55-9fbd-41ef-964f-91abc703e872 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.159564] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68e16ade-67ca-4192-8dc0-93de5fb48f8a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.167707] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8111c88-96e8-46db-bba7-85975137bda3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 927.189194] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 927.202907] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 927.226444] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 927.226657] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.287s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 929.587634] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e 
tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.587893] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 929.679967] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "68449c86-cda6-46ff-a349-c2072829257e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 929.680188] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "68449c86-cda6-46ff-a349-c2072829257e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 930.264455] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "035e8729-c02f-490e-a0e4-b8877b52e75b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 930.264455] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "035e8729-c02f-490e-a0e4-b8877b52e75b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 931.140752] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "e9136963-e0fc-4344-880b-a21549f2cf23" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 931.141074] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "e9136963-e0fc-4344-880b-a21549f2cf23" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 932.322699] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "89ccce06-2094-4f87-a77b-cad92d351dfa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 932.324266] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "89ccce06-2094-4f87-a77b-cad92d351dfa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 932.667034] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 932.667725] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 933.013745] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 933.013977] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 952.231435] env[66583]: WARNING oslo_vmware.rw_handles [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 952.231435] env[66583]:
ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 952.231435] env[66583]: ERROR oslo_vmware.rw_handles [ 952.232147] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 952.233677] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 952.233927] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Copying Virtual Disk [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/918167b0-a0fc-41d2-9095-9c2d561add02/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 952.234214] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-879e3e24-c6f4-45be-a733-2e402353f62b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.241760] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 952.241760] env[66583]: value = "task-3470328" [ 952.241760] env[66583]: _type = "Task" [ 952.241760] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.249034] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': task-3470328, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.753188] env[66583]: DEBUG oslo_vmware.exceptions [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 952.753426] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 952.753966] env[66583]: ERROR nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 952.753966] env[66583]: Faults: ['InvalidArgument'] [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Traceback (most recent call last): [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] yield resources [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self.driver.spawn(context, instance, image_meta, [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self._fetch_image_if_missing(context, vi) [ 952.753966] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] image_cache(vi, tmp_image_ds_loc) [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] vm_util.copy_virtual_disk( [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 
952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] session._wait_for_task(vmdk_copy_task) [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return self.wait_for_task(task_ref) [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return evt.wait() [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] result = hub.switch() [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 952.754318] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return self.greenlet.switch() [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self.f(*self.args, **self.kw) [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] raise exceptions.translate_fault(task_info.error) [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Faults: ['InvalidArgument'] [ 952.754690] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] [ 952.754690] env[66583]: INFO nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Terminating instance [ 952.755893] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 952.756122] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 952.756355] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-12723697-9719-4c52-8844-006f4e9c9a9c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.759536] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 952.759745] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 952.760477] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bc6b05d-93bc-45e8-a02e-3e34fd908279 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.767285] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 952.767442] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9b8a61a5-39dd-4868-925f-5afdc87f571a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.769578] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 952.769742] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 952.770688] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5517b5b3-52b9-4782-a5b2-1d94f9f1ca2e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.775469] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for the task: (returnval){ [ 952.775469] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52081b56-0823-b5f8-2851-129360f8d8aa" [ 952.775469] env[66583]: _type = "Task" [ 952.775469] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.782495] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52081b56-0823-b5f8-2851-129360f8d8aa, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 952.836661] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 952.836986] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 952.837193] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Deleting the datastore file [datastore2] fce1b601-0363-4447-b802-3ea5d3aa97a0 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 952.837448] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8c2b9930-c8e9-469f-ab24-d5febefa9282 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 952.843878] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 952.843878] env[66583]: value = "task-3470330" [ 952.843878] env[66583]: _type = "Task" [ 952.843878] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 952.851383] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': task-3470330, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 953.285781] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 953.286153] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Creating directory with path [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 953.286269] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-74f7e275-876d-48ad-b845-475190e414b2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.299562] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Created directory with path [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 953.299736] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Fetch image to [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 953.299907] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 953.300613] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8becc432-48de-460a-a0c9-1098e3cf09a8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.307012] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a348d79b-b0e9-4a67-b3d7-69142eabda0f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.315619] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e71e5d5-f841-4272-a437-8fd0bc275204 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.347597] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e2f817a-907b-4b76-a7e3-a746907d8179 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.353971] env[66583]: DEBUG oslo_vmware.api [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': task-3470330, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081156} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 953.355361] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 953.355547] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 953.355718] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 953.355899] env[66583]: INFO nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 953.357594] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6b0a54e5-c8a0-4257-bf21-a12d83253ea4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.359369] env[66583]: DEBUG nova.compute.claims [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 953.359539] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 953.359743] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.379672] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 953.425782] env[66583]: DEBUG oslo_vmware.rw_handles [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 953.482893] env[66583]: DEBUG oslo_vmware.rw_handles [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 953.483093] env[66583]: DEBUG oslo_vmware.rw_handles [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 953.573739] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-017193bc-53ad-47f9-956c-c408f84cab93 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.581359] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d23d238b-1aee-4585-9331-233a08b6df2c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.610423] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd951982-316e-4177-b742-e1664a42bc81 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.617169] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e51c194-fb73-41a9-bf58-263e9f957cd7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 953.630095] env[66583]: DEBUG nova.compute.provider_tree [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 953.638520] env[66583]: DEBUG nova.scheduler.client.report [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 953.651223] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.291s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.651750] env[66583]: ERROR nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 953.651750] env[66583]: Faults: ['InvalidArgument'] [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Traceback (most recent call last): [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] 
self.driver.spawn(context, instance, image_meta, [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self._fetch_image_if_missing(context, vi) [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] image_cache(vi, tmp_image_ds_loc) [ 953.651750] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] vm_util.copy_virtual_disk( [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] session._wait_for_task(vmdk_copy_task) [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return self.wait_for_task(task_ref) [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return evt.wait() [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] result = hub.switch() [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] return self.greenlet.switch() [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 953.652067] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] self.f(*self.args, **self.kw) [ 953.652499] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 953.652499] env[66583]: ERROR nova.compute.manager [instance: 
fce1b601-0363-4447-b802-3ea5d3aa97a0] raise exceptions.translate_fault(task_info.error) [ 953.652499] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 953.652499] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Faults: ['InvalidArgument'] [ 953.652499] env[66583]: ERROR nova.compute.manager [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] [ 953.652499] env[66583]: DEBUG nova.compute.utils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 953.653724] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Build of instance fce1b601-0363-4447-b802-3ea5d3aa97a0 was re-scheduled: A specified parameter was not correct: fileType [ 953.653724] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 953.654103] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 953.654278] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 953.654445] env[66583]: DEBUG nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 953.654604] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 953.871529] env[66583]: DEBUG nova.network.neutron [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 953.883225] env[66583]: INFO nova.compute.manager [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Took 0.23 seconds to deallocate network for instance. [ 953.968721] env[66583]: INFO nova.scheduler.client.report [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Deleted allocations for instance fce1b601-0363-4447-b802-3ea5d3aa97a0 [ 953.991653] env[66583]: DEBUG oslo_concurrency.lockutils [None req-990305ce-8e1d-499a-bc0e-e1d3bd5e889b tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 331.909s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.992793] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 133.150s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.992995] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 953.993209] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=66583) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 953.993367] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 953.995309] env[66583]: INFO nova.compute.manager [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Terminating instance [ 953.997717] env[66583]: DEBUG nova.compute.manager [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 953.997909] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 953.998349] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-88bf7fa2-c3c1-4b7c-8af8-635a700a4e46 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.008822] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c64cb79-6ceb-45a6-98e2-9a6e072a7e1c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.018405] env[66583]: DEBUG nova.compute.manager [None req-9cede889-2fbf-4812-833f-f974e9d0992a tempest-AttachInterfacesUnderV243Test-18895844 tempest-AttachInterfacesUnderV243Test-18895844-project-member] [instance: a5fa8d3d-ad60-4749-bba1-0e00538a543f] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.040024] env[66583]: WARNING nova.virt.vmwareapi.vmops [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance fce1b601-0363-4447-b802-3ea5d3aa97a0 could not be found. [ 954.040354] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 954.040545] env[66583]: INFO nova.compute.manager [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 954.040785] env[66583]: DEBUG oslo.service.loopingcall [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 954.041025] env[66583]: DEBUG nova.compute.manager [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 954.041127] env[66583]: DEBUG nova.network.neutron [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 954.044568] env[66583]: DEBUG nova.compute.manager [None req-9cede889-2fbf-4812-833f-f974e9d0992a tempest-AttachInterfacesUnderV243Test-18895844 tempest-AttachInterfacesUnderV243Test-18895844-project-member] [instance: a5fa8d3d-ad60-4749-bba1-0e00538a543f] Instance disappeared before build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.063532] env[66583]: DEBUG oslo_concurrency.lockutils [None req-9cede889-2fbf-4812-833f-f974e9d0992a tempest-AttachInterfacesUnderV243Test-18895844 tempest-AttachInterfacesUnderV243Test-18895844-project-member] Lock "a5fa8d3d-ad60-4749-bba1-0e00538a543f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.524s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.068219] env[66583]: DEBUG nova.network.neutron [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 954.072505] env[66583]: DEBUG nova.compute.manager [None req-682c93b0-1784-49bb-8a4e-923c025cf824 tempest-ServersTestBootFromVolume-1638954955 tempest-ServersTestBootFromVolume-1638954955-project-member] [instance: 0a575fbd-2390-401a-8df0-47a40e187c87] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.076203] env[66583]: INFO nova.compute.manager [-] [instance: fce1b601-0363-4447-b802-3ea5d3aa97a0] Took 0.03 seconds to deallocate network for instance. [ 954.096538] env[66583]: DEBUG nova.compute.manager [None req-682c93b0-1784-49bb-8a4e-923c025cf824 tempest-ServersTestBootFromVolume-1638954955 tempest-ServersTestBootFromVolume-1638954955-project-member] [instance: 0a575fbd-2390-401a-8df0-47a40e187c87] Instance disappeared before build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.117709] env[66583]: DEBUG oslo_concurrency.lockutils [None req-682c93b0-1784-49bb-8a4e-923c025cf824 tempest-ServersTestBootFromVolume-1638954955 tempest-ServersTestBootFromVolume-1638954955-project-member] Lock "0a575fbd-2390-401a-8df0-47a40e187c87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.171s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.128119] env[66583]: DEBUG nova.compute.manager [None req-b3654290-ffbc-4a3b-8b9f-dc572cd7feb0 tempest-ServerMetadataTestJSON-1507349439 tempest-ServerMetadataTestJSON-1507349439-project-member] [instance: 1a9f02ca-7220-490c-81ed-bf2422173315] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.155203] env[66583]: DEBUG nova.compute.manager [None req-b3654290-ffbc-4a3b-8b9f-dc572cd7feb0 tempest-ServerMetadataTestJSON-1507349439 tempest-ServerMetadataTestJSON-1507349439-project-member] [instance: 1a9f02ca-7220-490c-81ed-bf2422173315] Instance disappeared before build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.163937] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7d353572-f289-41d8-9d3c-ee0c0fa4b0a2 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "fce1b601-0363-4447-b802-3ea5d3aa97a0" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.171s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.177763] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b3654290-ffbc-4a3b-8b9f-dc572cd7feb0 tempest-ServerMetadataTestJSON-1507349439 tempest-ServerMetadataTestJSON-1507349439-project-member] Lock "1a9f02ca-7220-490c-81ed-bf2422173315" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.701s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.186609] env[66583]: DEBUG nova.compute.manager [None req-236ee1f0-ffad-4b81-864d-ae206bb4dd43 tempest-ServerActionsTestJSON-55625096 tempest-ServerActionsTestJSON-55625096-project-member] [instance: 2f03a941-3722-4df8-af76-3bd073f8927b] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.207512] env[66583]: DEBUG nova.compute.manager [None req-236ee1f0-ffad-4b81-864d-ae206bb4dd43 tempest-ServerActionsTestJSON-55625096 tempest-ServerActionsTestJSON-55625096-project-member] [instance: 2f03a941-3722-4df8-af76-3bd073f8927b] Instance disappeared before build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.225121] env[66583]: DEBUG oslo_concurrency.lockutils [None req-236ee1f0-ffad-4b81-864d-ae206bb4dd43 tempest-ServerActionsTestJSON-55625096 tempest-ServerActionsTestJSON-55625096-project-member] Lock "2f03a941-3722-4df8-af76-3bd073f8927b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.976s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.233831] env[66583]: DEBUG nova.compute.manager [None req-11979b60-2afd-4522-8f39-da403183c148 tempest-AttachVolumeTestJSON-25219850 tempest-AttachVolumeTestJSON-25219850-project-member] [instance: d04a1c66-b45e-4266-9e98-2682f7fa42d3] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.255777] env[66583]: DEBUG nova.compute.manager [None req-11979b60-2afd-4522-8f39-da403183c148 tempest-AttachVolumeTestJSON-25219850 tempest-AttachVolumeTestJSON-25219850-project-member] [instance: d04a1c66-b45e-4266-9e98-2682f7fa42d3] Instance disappeared before build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.275226] env[66583]: DEBUG oslo_concurrency.lockutils [None req-11979b60-2afd-4522-8f39-da403183c148 tempest-AttachVolumeTestJSON-25219850 tempest-AttachVolumeTestJSON-25219850-project-member] Lock "d04a1c66-b45e-4266-9e98-2682f7fa42d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.643s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.283589] env[66583]: DEBUG nova.compute.manager [None req-cbe21ec0-289f-4a4f-ab3e-5aa593f92ec5 tempest-ServerGroupTestJSON-906624891 tempest-ServerGroupTestJSON-906624891-project-member] [instance: b3cb9c35-714c-4ce5-b826-0c8398ed93b9] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.307619] env[66583]: DEBUG nova.compute.manager [None req-cbe21ec0-289f-4a4f-ab3e-5aa593f92ec5 tempest-ServerGroupTestJSON-906624891 tempest-ServerGroupTestJSON-906624891-project-member] [instance: b3cb9c35-714c-4ce5-b826-0c8398ed93b9] Instance disappeared before build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 954.327389] env[66583]: DEBUG oslo_concurrency.lockutils [None req-cbe21ec0-289f-4a4f-ab3e-5aa593f92ec5 tempest-ServerGroupTestJSON-906624891 tempest-ServerGroupTestJSON-906624891-project-member] Lock "b3cb9c35-714c-4ce5-b826-0c8398ed93b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 233.013s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.334753] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 954.379654] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 954.379890] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 954.381589] env[66583]: INFO nova.compute.claims [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 954.533920] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4dac28eb-1e80-49e6-9f94-52c4ad191ddf {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.541207] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41197a34-c1fa-4556-bc87-54ad8799b9cd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.570193] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eb531fb-ba08-40d4-aefa-446b7b1f1537 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.576951] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d02df7f3-5f68-45ae-9e2d-8763cbb3e70a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.589849] env[66583]: DEBUG nova.compute.provider_tree [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 954.598318] env[66583]: DEBUG nova.scheduler.client.report [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 954.610542] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 
tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.231s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 954.611014] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 954.641267] env[66583]: DEBUG nova.compute.utils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 954.644289] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 954.644495] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 954.652055] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 954.697609] env[66583]: DEBUG nova.policy [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f6bbca3b39094af19146bad3011ff5fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4c695dca72bc49ad9a51e0b3031dca53', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 954.711465] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 954.732533] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 954.732770] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 954.732926] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 954.733126] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 954.733274] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 954.733418] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 954.733663] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 954.733877] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 954.734079] env[66583]: DEBUG 
nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 954.734256] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 954.734434] env[66583]: DEBUG nova.virt.hardware [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 954.735338] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0190bf85-1f6e-4be6-8a09-01627b6a63f3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.743488] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fee20e69-aff8-4e14-b46e-f31caaf7efb0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 954.967199] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Successfully created port: 8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 955.910152] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Successfully updated port: 8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 955.912984] env[66583]: DEBUG nova.compute.manager [req-8651b301-171b-49d9-b8d6-05501fa80d8c req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Received event network-vif-plugged-8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 955.914767] env[66583]: DEBUG oslo_concurrency.lockutils [req-8651b301-171b-49d9-b8d6-05501fa80d8c req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] Acquiring lock "e7664037-62b0-4195-b935-eab75d232f5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 955.915149] env[66583]: DEBUG oslo_concurrency.lockutils [req-8651b301-171b-49d9-b8d6-05501fa80d8c req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] Lock "e7664037-62b0-4195-b935-eab75d232f5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 955.915344] env[66583]: DEBUG oslo_concurrency.lockutils [req-8651b301-171b-49d9-b8d6-05501fa80d8c 
req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] Lock "e7664037-62b0-4195-b935-eab75d232f5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 955.915518] env[66583]: DEBUG nova.compute.manager [req-8651b301-171b-49d9-b8d6-05501fa80d8c req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] No waiting events found dispatching network-vif-plugged-8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 955.915685] env[66583]: WARNING nova.compute.manager [req-8651b301-171b-49d9-b8d6-05501fa80d8c req-63a0f2ac-3a42-4592-9115-151c8f7fefa6 service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Received unexpected event network-vif-plugged-8072033a-a4dd-44d9-bb86-29b61e02d495 for instance with vm_state building and task_state spawning. [ 955.925397] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 955.925686] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquired lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 955.925686] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 955.985489] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 956.460628] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Updating instance_info_cache with network_info: [{"id": "8072033a-a4dd-44d9-bb86-29b61e02d495", "address": "fa:16:3e:28:a7:15", "network": {"id": "dae679a0-b79c-4809-ad4f-75bce0be4ec2", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1511238989-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c695dca72bc49ad9a51e0b3031dca53", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8072033a-a4", "ovs_interfaceid": "8072033a-a4dd-44d9-bb86-29b61e02d495", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 956.471080] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Releasing lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 956.471372] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Instance network_info: |[{"id": "8072033a-a4dd-44d9-bb86-29b61e02d495", "address": "fa:16:3e:28:a7:15", "network": {"id": "dae679a0-b79c-4809-ad4f-75bce0be4ec2", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1511238989-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c695dca72bc49ad9a51e0b3031dca53", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8072033a-a4", "ovs_interfaceid": "8072033a-a4dd-44d9-bb86-29b61e02d495", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 956.471724] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:28:a7:15', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8039f411-8c97-48fe-a5a9-9f5a42e4e7c6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8072033a-a4dd-44d9-bb86-29b61e02d495', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 956.479131] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Creating folder: Project (4c695dca72bc49ad9a51e0b3031dca53). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 956.479584] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b7284c76-0028-404b-a1c1-97d07db84607 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 956.490438] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Created folder: Project (4c695dca72bc49ad9a51e0b3031dca53) in parent group-v693485. [ 956.490606] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Creating folder: Instances. Parent ref: group-v693548. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 956.490805] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0c9c2407-0434-4859-871b-7248aa43a181 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 956.499141] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Created folder: Instances in parent group-v693548. [ 956.499355] env[66583]: DEBUG oslo.service.loopingcall [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 956.499527] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 956.499704] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-359f54ba-5a69-486f-a32c-cd9a210576c4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 956.518801] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 956.518801] env[66583]: value = "task-3470333" [ 956.518801] env[66583]: _type = "Task" [ 956.518801] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 956.525226] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470333, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 957.027613] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470333, 'name': CreateVM_Task, 'duration_secs': 0.294654} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 957.027873] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 957.028444] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 957.028607] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 957.028920] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 957.029176] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-de3466a5-12de-4db9-8465-e32ab66dd706 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 957.033523] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Waiting for the task: (returnval){ [ 957.033523] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52233113-ad1d-57ea-1659-c7b5b6940bae" [ 957.033523] env[66583]: _type = "Task" [ 957.033523] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 957.040903] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52233113-ad1d-57ea-1659-c7b5b6940bae, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 957.544718] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 957.545017] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 957.545244] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 957.636160] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "08689558-cc57-43c5-b56e-f9785b515717" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 957.636404] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "08689558-cc57-43c5-b56e-f9785b515717" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 957.934611] env[66583]: DEBUG nova.compute.manager [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Received event network-changed-8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 957.934798] env[66583]: DEBUG nova.compute.manager [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Refreshing instance network info cache due to event network-changed-8072033a-a4dd-44d9-bb86-29b61e02d495. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 957.935058] env[66583]: DEBUG oslo_concurrency.lockutils [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] Acquiring lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 957.935215] env[66583]: DEBUG oslo_concurrency.lockutils [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] Acquired lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 957.935382] env[66583]: DEBUG nova.network.neutron [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Refreshing network info cache for port 8072033a-a4dd-44d9-bb86-29b61e02d495 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 958.163561] env[66583]: DEBUG nova.network.neutron [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Updated VIF entry in instance network info cache for port 8072033a-a4dd-44d9-bb86-29b61e02d495. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 958.163916] env[66583]: DEBUG nova.network.neutron [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Updating instance_info_cache with network_info: [{"id": "8072033a-a4dd-44d9-bb86-29b61e02d495", "address": "fa:16:3e:28:a7:15", "network": {"id": "dae679a0-b79c-4809-ad4f-75bce0be4ec2", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-1511238989-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "4c695dca72bc49ad9a51e0b3031dca53", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8039f411-8c97-48fe-a5a9-9f5a42e4e7c6", "external-id": "nsx-vlan-transportzone-12", "segmentation_id": 12, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8072033a-a4", "ovs_interfaceid": "8072033a-a4dd-44d9-bb86-29b61e02d495", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 958.173071] env[66583]: DEBUG oslo_concurrency.lockutils [req-951f8d46-503c-4ab9-8797-47d2ed6626df req-d463ef63-41f9-47f5-af68-60c90803becb service nova] Releasing lock "refresh_cache-e7664037-62b0-4195-b935-eab75d232f5d" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 960.247930] env[66583]: WARNING oslo_vmware.rw_handles [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Error occurred while reading the HTTP 
response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 960.247930] env[66583]: ERROR oslo_vmware.rw_handles [ 960.248591] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 960.249542] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 960.249777] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Copying Virtual Disk [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/74bce0ce-3904-42c6-aba0-ecfc266a8bf8/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 960.250054] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-e896cc9e-cfbe-4a3f-a096-4d35f6acfee7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.257429] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 960.257429] env[66583]: value = "task-3470334" [ 960.257429] env[66583]: _type = "Task" [ 960.257429] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 960.266123] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470334, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 960.768535] env[66583]: DEBUG oslo_vmware.exceptions [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 960.768733] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 960.769316] env[66583]: ERROR nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 960.769316] env[66583]: Faults: ['InvalidArgument'] [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Traceback (most recent call last): [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] yield resources [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] self.driver.spawn(context, instance, image_meta, [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] self._fetch_image_if_missing(context, vi) [ 960.769316] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] image_cache(vi, tmp_image_ds_loc) [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] vm_util.copy_virtual_disk( [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] session._wait_for_task(vmdk_copy_task) [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] return self.wait_for_task(task_ref) [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] return evt.wait() [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] result = hub.switch() [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 960.769686] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] return self.greenlet.switch() [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] self.f(*self.args, **self.kw) [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] raise exceptions.translate_fault(task_info.error) [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Faults: ['InvalidArgument'] [ 960.769983] env[66583]: ERROR nova.compute.manager [instance: 87acbe03-624d-454c-b108-0566ca0d750e] [ 960.769983] env[66583]: INFO nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Terminating instance [ 960.771186] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 960.771441] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 960.772202] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 960.772438] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 960.772663] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0432dc55-e036-4ed7-8e74-e8c4d211f0e9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.775237] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fa1b84b-8254-420d-9597-5d4aa05022c5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.782027] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 960.782227] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f69b5c0a-4b74-46da-a248-3b452d44662d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.784336] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 960.784512] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 960.785514] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52f6b408-102f-4671-962c-1e6f431c6fea {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.790915] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Waiting for the task: (returnval){ [ 960.790915] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52f8d05e-5a21-5e91-5962-a6312355d3fc" [ 960.790915] env[66583]: _type = "Task" [ 960.790915] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 960.797869] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52f8d05e-5a21-5e91-5962-a6312355d3fc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 960.853084] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 960.853322] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 960.853498] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Deleting the datastore file [datastore1] 87acbe03-624d-454c-b108-0566ca0d750e {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 960.853752] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-03fc3cc9-8006-4224-8ac1-7a3d536f5c35 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 960.860123] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Waiting for the task: (returnval){ [ 960.860123] env[66583]: value = "task-3470336" [ 960.860123] env[66583]: _type = "Task" [ 960.860123] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 960.867871] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470336, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 961.301877] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 961.302214] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Creating directory with path [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 961.302395] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-86927ccf-de3e-4c42-9c9c-b0d67b176796 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.314093] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Created directory with path [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 961.314295] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Fetch image to [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 961.314470] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 961.315241] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e037d1d6-5a0d-48d1-bcc4-a2e20ad17417 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.321842] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b208df8a-5dcd-47d7-98c5-a6ee9eefd558 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.330693] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed904101-4441-4004-bd97-cafde55c2048 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.364716] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6bb546e4-947f-4503-89a9-e346ef31a06c {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.371196] env[66583]: DEBUG oslo_vmware.api [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Task: {'id': task-3470336, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074943} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 961.372531] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 961.372740] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 961.372937] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 961.373130] env[66583]: INFO nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Took 0.60 seconds to destroy the instance on the hypervisor. 
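The DeleteDatastoreFile_Task records above follow the same poll cycle seen for every vCenter task in this log: the task is reported at "progress is 0%", then either completes successfully with a duration_secs value or, as with the earlier CopyVirtualDisk_Task, the translated fault is raised. A minimal, self-contained sketch of that polling pattern follows; the TaskError class, the get_task_info callable, and the poll interval are illustrative stand-ins, not the real oslo_vmware.api implementation.

import time

POLL_INTERVAL = 0.5  # seconds; illustrative, the real driver uses a looping call

class TaskError(Exception):
    """Stand-in for the translated fault (e.g. VimFaultException: InvalidArgument)."""

def wait_for_task(get_task_info):
    """Poll task info until the task reaches a terminal state.

    get_task_info: callable returning an object with .state, .progress
    and .error attributes (a stand-in for vim.TaskInfo).
    """
    while True:
        info = get_task_info()
        if info.state == 'running':
            # corresponds to the "_poll_task ... progress is N%" DEBUG records
            print('progress is %s%%' % info.progress)
        elif info.state == 'success':
            # corresponds to "... completed successfully" with duration_secs
            return info
        elif info.state == 'error':
            # corresponds to "raise exceptions.translate_fault(task_info.error)"
            raise TaskError(info.error)
        time.sleep(POLL_INTERVAL)

In the real driver this loop runs inside a looping call on the API session, which is why each poll shows up above as a separate _poll_task record rather than a blocking wait.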
[ 961.375125] env[66583]: DEBUG nova.compute.claims [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 961.375297] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 961.375504] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 961.377934] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d2c87bf2-a6f9-4807-a164-f2fd316c9ae0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.397191] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 961.399975] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 961.400380] env[66583]: DEBUG nova.compute.utils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance 87acbe03-624d-454c-b108-0566ca0d750e could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 961.401743] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 961.401913] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 961.402086] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 961.402239] env[66583]: DEBUG nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 961.402398] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 961.426012] env[66583]: DEBUG nova.network.neutron [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 961.434682] env[66583]: INFO nova.compute.manager [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Took 0.03 seconds to deallocate network for instance. [ 961.443555] env[66583]: DEBUG oslo_vmware.rw_handles [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 961.501654] env[66583]: DEBUG oslo_vmware.rw_handles [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Completed reading data from the image iterator. 
{{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 961.502102] env[66583]: DEBUG oslo_vmware.rw_handles [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 961.519010] env[66583]: DEBUG oslo_concurrency.lockutils [None req-96b17a12-f9ca-4e4d-9366-bc180178845b tempest-DeleteServersAdminTestJSON-90630079 tempest-DeleteServersAdminTestJSON-90630079-project-member] Lock "87acbe03-624d-454c-b108-0566ca0d750e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 246.077s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 961.530452] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 961.576360] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 961.576765] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 961.578360] env[66583]: INFO nova.compute.claims [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 961.763187] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-afd6e778-a1cb-4167-b6a4-19d898762a96 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.770629] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cede416d-eaf1-49a7-9b25-3ba181c88f30 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.799964] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec65d9cf-39f2-4c76-962b-10025994361e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.806508] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5012ae9-8d21-4857-b1a1-a5a1975df948 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.819187] env[66583]: DEBUG nova.compute.provider_tree [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 961.827948] env[66583]: DEBUG nova.scheduler.client.report [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 961.842402] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 961.842903] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 961.877834] env[66583]: DEBUG nova.compute.utils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 961.879110] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 961.879301] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 961.889978] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Start building block device mappings for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 961.956634] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 961.976887] env[66583]: DEBUG nova.policy [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f899691878e549e59f3e0e1ebe8ad2a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '09706dc60f2148b5a1b340af34b11f0d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 961.980243] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 961.980547] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 961.980723] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 961.980935] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 961.981104] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 961.981256] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 
tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 961.981466] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 961.981620] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 961.981786] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 961.981947] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 961.982170] env[66583]: DEBUG nova.virt.hardware [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 961.983331] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31f07fae-8bb3-4fe9-83f0-3fd7b634d9d3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 961.991113] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-668789ac-a8e5-4109-8364-c4945c63eb94 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 962.309114] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully created port: 01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 962.593680] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully created port: e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 963.295607] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully created port: 
1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 963.773860] env[66583]: DEBUG nova.compute.manager [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-vif-plugged-01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 963.774150] env[66583]: DEBUG oslo_concurrency.lockutils [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] Acquiring lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 963.774365] env[66583]: DEBUG oslo_concurrency.lockutils [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 963.774833] env[66583]: DEBUG oslo_concurrency.lockutils [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 963.774833] env[66583]: DEBUG nova.compute.manager [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] No waiting events found dispatching network-vif-plugged-01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 963.774833] env[66583]: WARNING nova.compute.manager [req-a8ff65d4-f723-4b26-b281-3cdfc8c0088f req-c9de2e32-4c02-44a2-85a1-03f8d6472085 service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received unexpected event network-vif-plugged-01574144-1dad-4e27-a248-eba6720829eb for instance with vm_state building and task_state spawning. 
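The network-vif-plugged records above show the external-event handshake: the compute manager takes the per-instance "<uuid>-events" lock, tries to pop a registered waiter for the incoming event, and logs the event as unexpected when no waiter exists (here because the instance is still building and has not yet registered one). A minimal sketch of that pattern, using a simplified stand-in for the InstanceEvents class named in the lock messages; the keying and method bodies are assumptions, not Nova's actual implementation.

import threading

class InstanceEvents:
    def __init__(self):
        # mirrors the "<uuid>-events" lock taken in the records above
        self._lock = threading.Lock()
        self._events = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(self, instance_uuid, event_name):
        """Register a waiter before the operation that triggers the event."""
        with self._lock:
            ev = threading.Event()
            self._events[(instance_uuid, event_name)] = ev
            return ev

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:
            return self._events.pop((instance_uuid, event_name), None)

def handle_external_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # corresponds to "No waiting events found dispatching ..." and
        # "Received unexpected event ..." in the log above
        print('unexpected event %s for %s' % (event_name, instance_uuid))
    else:
        waiter.set()  # unblocks the thread waiting on this event

Pairing prepare_for_event with a timed wait is what lets a build path block until Neutron confirms a port is plugged; an event that arrives with no waiter, as above, is merely logged and dropped.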
[ 963.857131] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully updated port: 01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 964.520031] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully updated port: e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 965.055298] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Successfully updated port: 1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 965.091039] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 965.091039] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 965.091039] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 965.136935] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 965.716789] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updating instance_info_cache with network_info: [{"id": "01574144-1dad-4e27-a248-eba6720829eb", "address": "fa:16:3e:c3:03:95", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01574144-1d", "ovs_interfaceid": "01574144-1dad-4e27-a248-eba6720829eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "address": "fa:16:3e:c1:b0:81", "network": {"id": "10fe8f36-5634-4d1d-9460-ef81f1feb6ed", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-783158005", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2aa2ea5-e3", "ovs_interfaceid": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "address": "fa:16:3e:41:5a:ab", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": 
"nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ae0c69b-bc", "ovs_interfaceid": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 965.728607] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Releasing lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 965.728972] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Instance network_info: |[{"id": "01574144-1dad-4e27-a248-eba6720829eb", "address": "fa:16:3e:c3:03:95", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01574144-1d", "ovs_interfaceid": "01574144-1dad-4e27-a248-eba6720829eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "address": "fa:16:3e:c1:b0:81", "network": {"id": "10fe8f36-5634-4d1d-9460-ef81f1feb6ed", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-783158005", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2aa2ea5-e3", "ovs_interfaceid": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "address": "fa:16:3e:41:5a:ab", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": 
"br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ae0c69b-bc", "ovs_interfaceid": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 965.729453] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c3:03:95', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f0c0b05e-6d10-474c-9173-4c8f1dacac9f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '01574144-1dad-4e27-a248-eba6720829eb', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:c1:b0:81', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6d62c1cf-f39a-4626-9552-f1e13c692636', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:41:5a:ab', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f0c0b05e-6d10-474c-9173-4c8f1dacac9f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ae0c69b-bccf-447b-a48c-9f0c85878068', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 965.741725] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating folder: Project (09706dc60f2148b5a1b340af34b11f0d). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 965.742267] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-93b120eb-0879-40ed-9d8b-dbf8adf5f918 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 965.753436] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created folder: Project (09706dc60f2148b5a1b340af34b11f0d) in parent group-v693485. [ 965.755080] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating folder: Instances. Parent ref: group-v693551. 
{{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 965.755080] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dcc658a9-8d9a-49d0-a43b-88d55b34cc28 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 965.762679] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created folder: Instances in parent group-v693551. [ 965.762966] env[66583]: DEBUG oslo.service.loopingcall [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 965.763174] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 965.763364] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c12774be-7b47-4b90-bb6d-79e4e0a8d674 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 965.786812] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 965.786812] env[66583]: value = "task-3470339" [ 965.786812] env[66583]: _type = "Task" [ 965.786812] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 965.794516] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470339, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 965.800801] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-changed-01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 965.800889] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing instance network info cache due to event network-changed-01574144-1dad-4e27-a248-eba6720829eb. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 965.801156] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquiring lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 965.801248] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquired lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 965.801409] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing network info cache for port 01574144-1dad-4e27-a248-eba6720829eb {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 966.296778] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470339, 'name': CreateVM_Task, 'duration_secs': 0.413599} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 966.297101] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 966.297896] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 966.298099] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 966.298444] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 966.298734] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-75b5f3f5-b36d-48af-8617-3ce42a1cb332 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 966.303673] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 966.303673] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52137761-1f2e-16ba-7a4f-2a2446876d57" [ 966.303673] env[66583]: _type = "Task" [ 966.303673] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 966.307170] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updated VIF entry in instance network info cache for port 01574144-1dad-4e27-a248-eba6720829eb. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 966.307678] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updating instance_info_cache with network_info: [{"id": "01574144-1dad-4e27-a248-eba6720829eb", "address": "fa:16:3e:c3:03:95", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01574144-1d", "ovs_interfaceid": "01574144-1dad-4e27-a248-eba6720829eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "address": "fa:16:3e:c1:b0:81", "network": {"id": "10fe8f36-5634-4d1d-9460-ef81f1feb6ed", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-783158005", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2aa2ea5-e3", "ovs_interfaceid": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "address": "fa:16:3e:41:5a:ab", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ae0c69b-bc", "ovs_interfaceid": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 966.313327] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52137761-1f2e-16ba-7a4f-2a2446876d57, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 966.317587] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Releasing lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 966.317810] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-vif-plugged-e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 966.317998] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquiring lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 966.318213] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 966.318372] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.318532] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] No waiting events found dispatching network-vif-plugged-e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 966.318694] env[66583]: WARNING nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 
req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received unexpected event network-vif-plugged-e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 for instance with vm_state building and task_state spawning. [ 966.318851] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-changed-e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 966.319065] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing instance network info cache due to event network-changed-e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 966.319292] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquiring lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 966.319435] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquired lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 966.319591] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing network info cache for port e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 966.734360] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updated VIF entry in instance network info cache for port e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 966.734824] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updating instance_info_cache with network_info: [{"id": "01574144-1dad-4e27-a248-eba6720829eb", "address": "fa:16:3e:c3:03:95", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01574144-1d", "ovs_interfaceid": "01574144-1dad-4e27-a248-eba6720829eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "address": "fa:16:3e:c1:b0:81", "network": {"id": "10fe8f36-5634-4d1d-9460-ef81f1feb6ed", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-783158005", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2aa2ea5-e3", "ovs_interfaceid": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "address": "fa:16:3e:41:5a:ab", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", 
"segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ae0c69b-bc", "ovs_interfaceid": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 966.745154] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Releasing lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 966.745424] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-vif-plugged-1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 966.745681] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquiring lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 966.745895] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 966.746221] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 966.746433] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] No waiting events found dispatching network-vif-plugged-1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 966.746612] env[66583]: WARNING nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received unexpected event network-vif-plugged-1ae0c69b-bccf-447b-a48c-9f0c85878068 for instance with vm_state building and task_state spawning. 
[ 966.746777] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Received event network-changed-1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 966.746933] env[66583]: DEBUG nova.compute.manager [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing instance network info cache due to event network-changed-1ae0c69b-bccf-447b-a48c-9f0c85878068. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 966.747141] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquiring lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 966.747287] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Acquired lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 966.747438] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Refreshing network info cache for port 1ae0c69b-bccf-447b-a48c-9f0c85878068 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 966.814610] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 966.814862] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 966.815155] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 967.015428] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updated VIF entry in instance network info cache for port 1ae0c69b-bccf-447b-a48c-9f0c85878068. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 967.015887] env[66583]: DEBUG nova.network.neutron [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updating instance_info_cache with network_info: [{"id": "01574144-1dad-4e27-a248-eba6720829eb", "address": "fa:16:3e:c3:03:95", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.172", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", "segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap01574144-1d", "ovs_interfaceid": "01574144-1dad-4e27-a248-eba6720829eb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "address": "fa:16:3e:c1:b0:81", "network": {"id": "10fe8f36-5634-4d1d-9460-ef81f1feb6ed", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-783158005", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.55", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6d62c1cf-f39a-4626-9552-f1e13c692636", "external-id": "nsx-vlan-transportzone-748", "segmentation_id": 748, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape2aa2ea5-e3", "ovs_interfaceid": "e2aa2ea5-e340-4fc1-99fb-57c4cb7ab5f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "address": "fa:16:3e:41:5a:ab", "network": {"id": "c958f418-6892-4343-a463-ec60f104e835", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-781036724", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.139", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f0c0b05e-6d10-474c-9173-4c8f1dacac9f", "external-id": "nsx-vlan-transportzone-693", 
"segmentation_id": 693, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ae0c69b-bc", "ovs_interfaceid": "1ae0c69b-bccf-447b-a48c-9f0c85878068", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 967.026462] env[66583]: DEBUG oslo_concurrency.lockutils [req-05d345e7-54e5-4c7d-9080-ea1b34397e50 req-1a3688bf-5897-423e-9e06-fc5f3a0959fe service nova] Releasing lock "refresh_cache-504d18e4-8457-431b-b6cb-b26a0c64b14b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 983.228761] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 983.846709] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 984.846684] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 984.846985] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 984.846985] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 984.859348] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 984.859641] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 984.859641] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 984.859790] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 984.860594] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 985.846576] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 985.846832] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 986.847671] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 986.847997] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 986.847997] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 986.858083] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 986.858305] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 986.858471] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 986.858626] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 986.859689] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2b138fd-6ca9-496b-83ff-493e6f8ab7d3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.867977] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edff113c-c8a5-4d46-a023-3bbd9fba23fc {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.881990] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-beb59313-08ae-451c-a77e-b14c442e73c6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.888264] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd9db7a6-2637-4113-a955-1eaa772ac304 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 986.916304] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180960MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 986.916458] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 986.916648] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 986.962233] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.962685] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e7664037-62b0-4195-b935-eab75d232f5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.962685] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 504d18e4-8457-431b-b6cb-b26a0c64b14b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 986.995214] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.006090] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 68449c86-cda6-46ff-a349-c2072829257e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.016749] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 035e8729-c02f-490e-a0e4-b8877b52e75b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.028455] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e9136963-e0fc-4344-880b-a21549f2cf23 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.041652] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89ccce06-2094-4f87-a77b-cad92d351dfa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.051573] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.064287] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.074588] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 08689558-cc57-43c5-b56e-f9785b515717 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 987.074804] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 987.075007] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 987.197343] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6a40ee1-e8c3-47d1-9208-c1eec471deaa {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.204726] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5034f0a0-67e7-4e7c-8347-50509d1ab42a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.234161] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67c6c734-eebb-4f90-8761-c51406433347 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.240688] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af6f21ba-4965-416c-9339-b886b1625580 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 987.253438] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 987.261858] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 987.274943] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 987.275185] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.358s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 988.269708] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic 
task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 999.761734] env[66583]: WARNING oslo_vmware.rw_handles [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 999.761734] env[66583]: ERROR oslo_vmware.rw_handles [ 999.762454] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 999.764205] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 999.764464] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Copying Virtual Disk [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore2] vmware_temp/a56d3f19-392b-4caa-84e9-f7a62bbf141b/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 999.764761] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f1fb3a51-5215-4957-bd9a-e1d3e41b5327 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 999.773954] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 
tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for the task: (returnval){
[ 999.773954] env[66583]: value = "task-3470341"
[ 999.773954] env[66583]: _type = "Task"
[ 999.773954] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 999.781519] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Task: {'id': task-3470341, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1000.283998] env[66583]: DEBUG oslo_vmware.exceptions [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1000.284258] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1000.284827] env[66583]: ERROR nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1000.284827] env[66583]: Faults: ['InvalidArgument']
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Traceback (most recent call last):
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] yield resources
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self.driver.spawn(context, instance, image_meta,
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self._fetch_image_if_missing(context, vi)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] image_cache(vi, tmp_image_ds_loc)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] vm_util.copy_virtual_disk(
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] session._wait_for_task(vmdk_copy_task)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return self.wait_for_task(task_ref)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return evt.wait()
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] result = hub.switch()
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return self.greenlet.switch()
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self.f(*self.args, **self.kw)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] raise exceptions.translate_fault(task_info.error)
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Faults: ['InvalidArgument']
[ 1000.284827] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8]
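The traceback above ends in oslo.vmware's task poller: `_poll_task` reads the vCenter task state and, on error, raises whatever `translate_fault` produces. `InvalidArgument` has no specific exception mapping (hence the earlier "Fault InvalidArgument not matched" debug line), so the generic `VimFaultException` surfaces. A hedged sketch of that polling pattern, with stand-in names rather than the real oslo.vmware internals:

```python
# Minimal sketch (assumed names, not oslo.vmware's code) of task polling
# and fault translation. get_task_info stands in for a PropertyCollector read.
import time


class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list  # e.g. ['InvalidArgument']


def wait_for_task(get_task_info, interval=0.5):
    """Poll a vSphere task until it succeeds, or raise its translated fault."""
    while True:
        info = get_task_info()
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # A fault name with no registered exception class falls back to
            # the generic VimFaultException, which is what the log shows.
            raise VimFaultException(info["faults"], info["message"])
        time.sleep(interval)
```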
[ 1000.285813] env[66583]: INFO nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Terminating instance
[ 1000.286703] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1000.286916] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1000.287179] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f29052d5-8d41-4e78-9a9e-1741fc8fae01 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.289414] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1000.289607] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1000.290306] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49c234f4-a4f2-4c15-8152-2cc2309af24c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.296911] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1000.297114] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0f53c8e3-453b-4fe6-bc51-9f94b2efde93 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.299104] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1000.299274] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1000.300185] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b3b780a3-e24e-420b-ab48-8c14f4b4683f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.304855] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Waiting for the task: (returnval){
[ 1000.304855] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52eb8af6-5f2e-47b7-32eb-3eb4fe7c74d0"
[ 1000.304855] env[66583]: _type = "Task"
[ 1000.304855] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1000.311509] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52eb8af6-5f2e-47b7-32eb-3eb4fe7c74d0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1000.362429] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1000.362620] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1000.362797] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Deleting the datastore file [datastore2] 89e32d26-aa13-4b13-9aec-9e35513946e8 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1000.363064] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9d9e8249-ab47-434c-a458-fa010f05061a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.369014] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for the task: (returnval){
[ 1000.369014] env[66583]: value = "task-3470343"
[ 1000.369014] env[66583]: _type = "Task"
[ 1000.369014] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1000.376665] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Task: {'id': task-3470343, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1000.814956] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1000.815328] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Creating directory with path [datastore2] vmware_temp/62f92a1a-2e61-4c98-a261-f4e57e21f957/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1000.815486] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6efb2874-ca8c-4566-ae20-46db7c27011f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.827560] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Created directory with path [datastore2] vmware_temp/62f92a1a-2e61-4c98-a261-f4e57e21f957/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1000.827745] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Fetch image to [datastore2] vmware_temp/62f92a1a-2e61-4c98-a261-f4e57e21f957/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1000.827916] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/62f92a1a-2e61-4c98-a261-f4e57e21f957/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
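This request is on the image-cache miss path: `_fetch_image_if_missing` prepares a per-request temp location on the datastore, downloads the Glance image there as `tmp-sparse.vmdk`, and then caches it (the caching copy is the `CopyVirtualDisk_Task` that raised the `fileType` fault for the other request above). A hedged sketch of that flow, with illustrative helper callables rather than nova's signatures:

```python
# Sketch of the cache-miss flow logged above; download/cache_copy are
# illustrative stand-ins, not nova's actual helpers.
import os


def fetch_image_if_missing(cache_root, image_id, tmp_root, download, cache_copy):
    cached_vmdk = os.path.join(cache_root, image_id, image_id + ".vmdk")
    if os.path.exists(cached_vmdk):
        return cached_vmdk
    # "Preparing fetch location": a per-request temp dir on the datastore
    tmp_vmdk = os.path.join(tmp_root, image_id, "tmp-sparse.vmdk")
    os.makedirs(os.path.dirname(tmp_vmdk), exist_ok=True)
    download(image_id, tmp_vmdk)       # "Downloading image file data ..."
    cache_copy(tmp_vmdk, cached_vmdk)  # CopyVirtualDisk_Task for sparse images
    return cached_vmdk
```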
[ 1000.828615] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bcaed3e-70bf-43e0-af68-d63610d51dc8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.835072] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83d9c59c-c12c-474e-8fb7-54e9f92dd5d6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.843779] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83f5c4c5-3fb7-40f6-9087-389494d5d6b3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.877565] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e5d106-2b9c-4072-bc80-5ebaf8e0e00d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.884241] env[66583]: DEBUG oslo_vmware.api [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Task: {'id': task-3470343, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07257} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1000.885626] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1000.885816] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1000.885994] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1000.886194] env[66583]: INFO nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Took 0.60 seconds to destroy the instance on the hypervisor.
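The teardown just logged follows a fixed order: unregister the VM from the vCenter inventory first, then delete its directory from the datastore. A minimal sketch of that ordering, assuming an illustrative `session` stub rather than the vmwareapi session API:

```python
# Hedged sketch of the destroy order visible above (UnregisterVM, then
# FileManager.DeleteDatastoreFile_Task). 'session' is a stand-in object.
def destroy_on_hypervisor(session, vm_ref, ds_dir):
    session.invoke("UnregisterVM", vm_ref)  # drop from inventory first
    task = session.invoke("DeleteDatastoreFile_Task", ds_dir)
    session.wait_for_task(task)             # "Deleted contents of the VM ..."
```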
[ 1000.887916] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-107bc337-121b-4bd8-85a2-e33d6e6ae425 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1000.889767] env[66583]: DEBUG nova.compute.claims [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1000.889946] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1000.890174] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
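The acquiring/acquired/released lines with waited/held timings come from oslo.concurrency's lockutils wrapper around the resource tracker. A minimal equivalent of that bookkeeping (an assumption for illustration, not oslo.concurrency's implementation):

```python
# Named-lock timing sketch, mirroring the lockutils log lines around
# "compute_resources".
import threading
import time
from contextlib import contextmanager

_locks = {}


@contextmanager
def named_lock(name):
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    t_acquired = time.monotonic()
    print(f'Lock "{name}" acquired :: waited {t_acquired - t0:.3f}s')
    try:
        yield
    finally:
        lock.release()
        print(f'Lock "{name}" released :: held {time.monotonic() - t_acquired:.3f}s')


with named_lock("compute_resources"):
    pass  # abort_instance_claim() runs under the lock here
```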
[ 1000.911983] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1001.050526] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-088dee2f-bd6a-4320-b6b5-90109e9e7f46 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.058058] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cb51f112-fe72-4aaa-a16e-0aaac88606bf {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.087383] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1001.088958] env[66583]: ERROR nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last):
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] result = getattr(controller, method)(*args, **kwargs)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._get(image_id)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] resp, body = self.http_client.get(url, headers=header)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.request(url, 'GET', **kwargs)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._handle_response(resp)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exc.from_response(resp, resp.content)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0]
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] During handling of the above exception, another exception occurred:
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0]
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last):
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] yield resources
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.driver.spawn(context, instance, image_meta,
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._fetch_image_if_missing(context, vi)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image_fetch(context, vi, tmp_image_ds_loc)
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] images.fetch_image(
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1001.088958] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] metadata = IMAGE_API.get(context, image_ref)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return session.show(context, image_id,
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] _reraise_translated_image_exception(image_id)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise new_exc.with_traceback(exc_trace)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] result = getattr(controller, method)(*args, **kwargs)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._get(image_id)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] resp, body = self.http_client.get(url, headers=header)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.request(url, 'GET', **kwargs)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._handle_response(resp)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exc.from_response(resp, resp.content)
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1001.090012] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0]
[ 1001.090012] env[66583]: INFO nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Terminating instance
[ 1001.091149] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c4b2288c-f06f-47f6-b42c-abcaa7fbc33e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.093515] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1001.093719] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1001.094314] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1001.094502] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1001.094711] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bdf1bb8a-2782-4b8a-abd2-6e12a8424737 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.096954] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64a7b9a8-75d2-4bfd-9ac3-8696358e1d7d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.104304] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23094aa4-fd86-4eeb-a83c-43cdc968e770 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.110117] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1001.110287] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1001.111434] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3fc9c97a-7dd3-4e70-a4c3-ad3ec7979739 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.122457] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1001.122859] env[66583]: DEBUG nova.compute.provider_tree [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1001.124150] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1c9123e8-6c51-479e-b761-e519a0223f79 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.126571] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Waiting for the task: (returnval){
[ 1001.126571] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5254e75b-d3c0-dbb5-31b7-893b5dead522"
[ 1001.126571] env[66583]: _type = "Task"
[ 1001.126571] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1001.135090] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5254e75b-d3c0-dbb5-31b7-893b5dead522, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1001.136325] env[66583]: DEBUG nova.scheduler.client.report [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1001.149317] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.259s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
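The report client just compared the inventory it would send against what placement already holds for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc and found no change. Placement derives effective capacity per resource class as (total - reserved) * allocation_ratio; applying that to the values logged above:

```python
# Effective capacity for the inventory logged above (formula as placement
# computes it: (total - reserved) * allocation_ratio).
inventory = {
    "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]
    print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
```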
[ 1001.149880] env[66583]: ERROR nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1001.149880] env[66583]: Faults: ['InvalidArgument']
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Traceback (most recent call last):
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self.driver.spawn(context, instance, image_meta,
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self._fetch_image_if_missing(context, vi)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] image_cache(vi, tmp_image_ds_loc)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] vm_util.copy_virtual_disk(
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] session._wait_for_task(vmdk_copy_task)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return self.wait_for_task(task_ref)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return evt.wait()
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] result = hub.switch()
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] return self.greenlet.switch()
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] self.f(*self.args, **self.kw)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] raise exceptions.translate_fault(task_info.error)
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Faults: ['InvalidArgument']
[ 1001.149880] env[66583]: ERROR nova.compute.manager [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8]
[ 1001.150909] env[66583]: DEBUG nova.compute.utils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1001.151972] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Build of instance 89e32d26-aa13-4b13-9aec-9e35513946e8 was re-scheduled: A specified parameter was not correct: fileType
[ 1001.151972] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
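When `spawn` fails like this, `_do_build_and_run_instance` marks the build as re-scheduled so another host can be tried; before handing the instance back, the manager unplugs VIFs if the driver supports it and releases the network allocation. A hedged sketch of that decision path, with illustrative names rather than nova's exact signatures:

```python
# Reschedule-path sketch (assumed names) matching the log lines that follow.
def do_build_and_run(instance, driver, network_api, reschedule):
    try:
        driver.spawn(instance)
    except Exception as exc:
        print(f"Build of instance {instance} was re-scheduled: {exc}")
        if hasattr(driver, "unplug_vifs"):
            driver.unplug_vifs(instance)
        # else: "Virt driver does not provide unplug_vifs method ..."
        network_api.deallocate_for_instance(instance)  # free Neutron ports
        reschedule(instance)
```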
[ 1001.152352] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1001.152546] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1001.152783] env[66583]: DEBUG nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1001.152969] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1001.178050] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1001.178279] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1001.178463] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Deleting the datastore file [datastore2] 8c6830c9-f8e4-4c72-892c-3012cd9b84c0 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1001.178701] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-11cd0b6c-02fc-4676-a277-98a558892d13 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.186426] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Waiting for the task: (returnval){
[ 1001.186426] env[66583]: value = "task-3470345"
[ 1001.186426] env[66583]: _type = "Task"
[ 1001.186426] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1001.193327] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Task: {'id': task-3470345, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1001.377860] env[66583]: DEBUG nova.network.neutron [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1001.389339] env[66583]: INFO nova.compute.manager [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Took 0.24 seconds to deallocate network for instance.
[ 1001.476498] env[66583]: INFO nova.scheduler.client.report [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Deleted allocations for instance 89e32d26-aa13-4b13-9aec-9e35513946e8
[ 1001.495783] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1c622db4-aec1-4b3c-bce8-9a5e990772e4 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 376.341s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1001.497202] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 177.512s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1001.497427] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Acquiring lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1001.497663] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1001.497838] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1001.499749] env[66583]: INFO nova.compute.manager [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Terminating instance
[ 1001.501846] env[66583]: DEBUG nova.compute.manager [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1001.502205] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1001.502593] env[66583]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-50643056-3f0c-4025-90c7-403186a76e5f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.512678] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4d8ed7a-f2d6-49ac-9a83-f6b6ce56f988 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.522983] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1001.543213] env[66583]: WARNING nova.virt.vmwareapi.vmops [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 89e32d26-aa13-4b13-9aec-9e35513946e8 could not be found.
[ 1001.543445] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1001.543637] env[66583]: INFO nova.compute.manager [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1001.543892] env[66583]: DEBUG oslo.service.loopingcall [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1001.544127] env[66583]: DEBUG nova.compute.manager [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1001.544292] env[66583]: DEBUG nova.network.neutron [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1001.570016] env[66583]: DEBUG nova.network.neutron [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1001.577965] env[66583]: INFO nova.compute.manager [-] [instance: 89e32d26-aa13-4b13-9aec-9e35513946e8] Took 0.03 seconds to deallocate network for instance.
[ 1001.583073] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1001.583143] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1001.584465] env[66583]: INFO nova.compute.claims [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1001.639317] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1001.639317] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Creating directory with path [datastore2] vmware_temp/7769d523-1088-41af-a68b-b9df93a9d249/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1001.639403] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6eaad050-a372-48be-b9f8-c446ac278037 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.650666] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Created directory with path [datastore2] vmware_temp/7769d523-1088-41af-a68b-b9df93a9d249/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1001.650867] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Fetch image to [datastore2] vmware_temp/7769d523-1088-41af-a68b-b9df93a9d249/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1001.651055] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/7769d523-1088-41af-a68b-b9df93a9d249/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1001.651812] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0eed4d10-a544-4ec4-b5f5-d3830acff954 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.663466] env[66583]: DEBUG oslo_concurrency.lockutils [None req-10a75c63-a7ef-48ec-9cb6-960860a99b70 tempest-ServersTestFqdnHostnames-1041849233 tempest-ServersTestFqdnHostnames-1041849233-project-member] Lock "89e32d26-aa13-4b13-9aec-9e35513946e8" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.166s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1001.664905] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-887d65c7-c426-499f-9e55-9db4ed052d2b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.677337] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f998f313-22e5-499b-a7af-f76fff3e4c7e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.714591] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1c9da8-bc9d-41f8-8f91-3439df3db5e3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.722190] env[66583]: DEBUG oslo_vmware.api [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Task: {'id': task-3470345, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075082} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1001.725912] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1001.726156] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1001.726293] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1001.726472] env[66583]: INFO nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Took 0.63 seconds to destroy the instance on the hypervisor.
[ 1001.728556] env[66583]: DEBUG nova.compute.claims [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1001.728720] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1001.729101] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-72aaa0ff-f8d9-419e-8af9-f69a600ca9f4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.800580] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eacc923d-989b-496b-b3b5-e44a636e44c3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.808363] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efd4f341-fe53-4162-8e1c-534fb0e4487e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.814974] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1001.841943] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d192277-1977-4430-be48-319082527a30 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.849849] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec381e42-2a9c-4b2d-84b5-3894297a8f08 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1001.864423] env[66583]: DEBUG nova.compute.provider_tree [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1001.873734] env[66583]: DEBUG nova.scheduler.client.report [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1001.886715] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1001.887265] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1001.890056] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.161s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1001.913433] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.023s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1001.914082] env[66583]: DEBUG nova.compute.utils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance 8c6830c9-f8e4-4c72-892c-3012cd9b84c0 could not be found.
{{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1001.915800] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1001.915963] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1001.916152] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1001.916331] env[66583]: DEBUG nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1001.916495] env[66583]: DEBUG nova.network.neutron [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1001.922496] env[66583]: DEBUG nova.compute.utils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1001.922496] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Not allocating networking since 'none' was specified. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 1001.931370] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Start building block device mappings for instance.
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1001.954259] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1001.955094] env[66583]: ERROR nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] result = getattr(controller, method)(*args, **kwargs) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._get(image_id) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] resp, body = self.http_client.get(url, headers=header) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.request(url, 'GET', **kwargs) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._handle_response(resp) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise exc.from_response(resp, resp.content) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] During handling of the above exception, another exception occurred: [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] yield resources [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.driver.spawn(context, instance, image_meta, [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._fetch_image_if_missing(context, vi) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image_fetch(context, vi, tmp_image_ds_loc) [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] images.fetch_image( [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1001.955094] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] metadata = IMAGE_API.get(context, image_ref) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return session.show(context, image_id, [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] _reraise_translated_image_exception(image_id) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise new_exc.with_traceback(exc_trace) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] result = getattr(controller, method)(*args, **kwargs) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._get(image_id) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] resp, body = self.http_client.get(url, headers=header) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.request(url, 'GET', **kwargs) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._handle_response(resp) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise 
exc.from_response(resp, resp.content) [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1001.955982] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1001.955982] env[66583]: INFO nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Terminating instance [ 1001.957187] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1001.957401] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1001.958048] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1001.958247] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1001.958480] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-28accf73-4f55-4d40-805a-39c8ab337ca7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.962764] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb9b8d26-bd7a-4913-b6a3-b521af5700f3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.974299] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1001.974581] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4864655e-6119-4349-8c0c-a0aa75c7410b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.977225] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Created directory with path 
[datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1001.977417] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1001.978391] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b55015c8-a6cb-480e-841b-d47184ad79af {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1001.983179] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Waiting for the task: (returnval){ [ 1001.983179] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]524a7bfe-f82e-37fe-e82d-dcd1382cf698" [ 1001.983179] env[66583]: _type = "Task" [ 1001.983179] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1001.990675] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]524a7bfe-f82e-37fe-e82d-dcd1382cf698, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1001.999742] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1002.030454] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1002.030647] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1002.030700] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Deleting the datastore file [datastore2] 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1002.030920] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-159e8831-1d35-4992-84fe-c5ecc91814f0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.037723] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Waiting for the task: (returnval){ [ 1002.037723] env[66583]: value = "task-3470347" [ 1002.037723] env[66583]: _type = "Task" [ 1002.037723] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1002.038404] env[66583]: DEBUG neutronclient.v2_0.client [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1002.041456] env[66583]: ERROR nova.compute.manager [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] result = getattr(controller, method)(*args, **kwargs) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._get(image_id) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] resp, body = self.http_client.get(url, headers=header) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.request(url, 'GET', **kwargs) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._handle_response(resp) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exc.from_response(resp, resp.content) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] During handling of the above exception, another exception occurred: [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.driver.spawn(context, instance, image_meta, [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._fetch_image_if_missing(context, vi) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image_fetch(context, vi, tmp_image_ds_loc) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] images.fetch_image( [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] metadata = IMAGE_API.get(context, image_ref) [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return session.show(context, image_id, [ 1002.041456] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] _reraise_translated_image_exception(image_id) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise new_exc.with_traceback(exc_trace) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 
8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] result = getattr(controller, method)(*args, **kwargs) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._get(image_id) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] resp, body = self.http_client.get(url, headers=header) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.request(url, 'GET', **kwargs) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._handle_response(resp) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exc.from_response(resp, resp.content) [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] During handling of the above exception, another exception occurred: [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._build_and_run_instance(context, instance, image, [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] with excutils.save_and_reraise_exception(): [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.force_reraise() [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise self.value [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] with self.rt.instance_claim(context, instance, node, allocs, [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.abort() [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1002.042488] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return f(*args, **kwargs) [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._unset_instance_host_and_node(instance) [ 1002.043469] env[66583]: ERROR nova.compute.manager 
[instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] instance.save() [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] updates, result = self.indirection_api.object_action( [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return cctxt.call(context, 'object_action', objinst=objinst, [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] result = self.transport._send( [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._driver.send(target, ctxt, message, [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise result [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] nova.exception_Remote.InstanceNotFound_Remote: Instance 8c6830c9-f8e4-4c72-892c-3012cd9b84c0 could not be found. 
[ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return getattr(target, method)(*args, **kwargs) [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return fn(self, *args, **kwargs) [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] old_ref, inst_ref = db.instance_update_and_get_original( [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return f(*args, **kwargs) [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] with excutils.save_and_reraise_exception() as ectxt: [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.force_reraise() [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise self.value [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return f(*args, 
**kwargs) [ 1002.043469] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return f(context, *args, **kwargs) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exception.InstanceNotFound(instance_id=uuid) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] nova.exception.InstanceNotFound: Instance 8c6830c9-f8e4-4c72-892c-3012cd9b84c0 could not be found. [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] During handling of the above exception, another exception occurred: [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] exception_handler_v20(status_code, error_body) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise client_exc(message=error_message, [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1002.044426] 
env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Neutron server returns request_ids: ['req-29d57f33-3c5d-4de1-b6de-478303d7bda7'] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] During handling of the above exception, another exception occurred: [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] Traceback (most recent call last): [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._deallocate_network(context, instance, requested_networks) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self.network_api.deallocate_for_instance( [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] data = neutron.list_ports(**search_opts) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.list('ports', self.ports_path, retrieve_all, [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1002.044426] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] for r in self._pagination(collection, path, **params): [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] res = self.get(path, params=params) [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.retry_request("GET", action, body=body, [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] return self.do_request(method, action, body=body, [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] ret = obj(*args, **kwargs) [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] self._handle_fault_response(status_code, replybody, resp) [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] raise exception.Unauthorized() [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] nova.exception.Unauthorized: Not authorized. [ 1002.046961] env[66583]: ERROR nova.compute.manager [instance: 8c6830c9-f8e4-4c72-892c-3012cd9b84c0] [ 1002.054493] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Task: {'id': task-3470347, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1002.057113] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1002.057337] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1002.057494] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1002.057680] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1002.057827] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1002.057974] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1002.058207] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1002.058369] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1002.058533] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e 
tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1002.058720] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1002.058900] env[66583]: DEBUG nova.virt.hardware [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1002.059660] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44652baf-e355-4d27-965a-b96d3f1969b9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.069853] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c40fb134-035d-4b35-9f7b-0f2f11d80d82 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.074127] env[66583]: DEBUG oslo_concurrency.lockutils [None req-d9ebf882-8fe8-46b6-8c32-5183d9f51bf5 tempest-ListImageFiltersTestJSON-480867675 tempest-ListImageFiltersTestJSON-480867675-project-member] Lock "8c6830c9-f8e4-4c72-892c-3012cd9b84c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 375.332s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1002.088335] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Instance VIF info [] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1002.093768] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Creating folder: Project (ff8fe03e324b4a5ab6ea2a053c93bfea). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1002.094143] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1002.097452] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6fcb9dd-4594-4f84-87c8-c729aac8e416 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.106426] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Created folder: Project (ff8fe03e324b4a5ab6ea2a053c93bfea) in parent group-v693485. 
[ 1002.106655] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Creating folder: Instances. Parent ref: group-v693554. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1002.106948] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-64cba3b2-c044-4e0a-b068-c18d3d284616 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.115175] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Created folder: Instances in parent group-v693554. [ 1002.115438] env[66583]: DEBUG oslo.service.loopingcall [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1002.115620] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1002.115811] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0584680a-9f72-49fd-876c-b4cdd0efd27d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.135215] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1002.135215] env[66583]: value = "task-3470350" [ 1002.135215] env[66583]: _type = "Task" [ 1002.135215] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1002.144367] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470350, 'name': CreateVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1002.145301] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.145532] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.146930] env[66583]: INFO nova.compute.claims [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1002.322220] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a373576-806a-4145-a5e2-5756d731c347 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.330523] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf124c8c-81eb-41ae-aa87-19e4292a4abe {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.360462] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2056e0f8-5489-47be-816d-ff5c61d7c95b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.367659] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec6a413c-fcf8-4604-bd20-b08cb66a4230 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.380280] env[66583]: DEBUG nova.compute.provider_tree [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1002.388674] env[66583]: DEBUG nova.scheduler.client.report [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1002.402803] env[66583]: DEBUG 
oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1002.403300] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1002.435265] env[66583]: DEBUG nova.compute.utils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1002.437158] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1002.437343] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1002.446100] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Start building block device mappings for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1002.494926] env[66583]: DEBUG nova.policy [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '15e500a8733e43bf9672ccbd90fc0561', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '60cc48aabaaf4189a35327c52cfdfce0', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1002.496430] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1002.496665] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Creating directory with path [datastore2] vmware_temp/dc5eaf69-e6e1-41d5-935d-1a331b8b8830/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1002.496886] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-726195b6-c62b-4147-9d8e-e99119f86dce {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.509519] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Created directory with path [datastore2] vmware_temp/dc5eaf69-e6e1-41d5-935d-1a331b8b8830/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1002.511224] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Fetch image to [datastore2] vmware_temp/dc5eaf69-e6e1-41d5-935d-1a331b8b8830/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1002.511224] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/dc5eaf69-e6e1-41d5-935d-1a331b8b8830/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1002.511487] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab8da75f-33fa-4f8f-86b5-a647e19eb8db {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.515817] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1002.524152] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed92ecc7-5893-4164-b1a3-afd977cdc562 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.535660] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a36ea45a-0f96-4324-a8c9-49903fd1d9e6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.568580] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1002.568826] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1002.568980] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1002.569199] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1002.569319] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1002.569673] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 
tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1002.569673] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1002.569808] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1002.569971] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1002.570147] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1002.570318] env[66583]: DEBUG nova.virt.hardware [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1002.573795] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00d91089-1a32-411a-9323-de42e3b48546 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.579073] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a717bc19-d865-49c5-ab1b-96febce5db30 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.583609] env[66583]: DEBUG oslo_vmware.api [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Task: {'id': task-3470347, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074559} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1002.586802] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1002.586996] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1002.587186] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1002.591506] env[66583]: INFO nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1002.591506] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce9266bb-b65a-4959-a637-f21dacd51add {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.596429] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f32d12e8-fffa-412e-8a64-e72b63bc05bc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.598550] env[66583]: DEBUG nova.compute.claims [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1002.598726] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.598935] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.627936] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1002.628721] env[66583]: DEBUG nova.compute.utils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1002.633099] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1002.634383] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1002.634780] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1002.634780] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1002.635018] env[66583]: DEBUG nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1002.635084] env[66583]: DEBUG nova.network.neutron [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1002.645188] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470350, 'name': CreateVM_Task, 'duration_secs': 0.233485} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1002.645361] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1002.645752] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1002.645906] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1002.646227] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1002.646493] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c44723b2-06f7-43f4-bc99-7dc26bab1d7a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.651623] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for the task: (returnval){ [ 1002.651623] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52810a1b-ed90-36e6-a60f-141d49410140" [ 1002.651623] env[66583]: _type = "Task" [ 1002.651623] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1002.661399] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52810a1b-ed90-36e6-a60f-141d49410140, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1002.678198] env[66583]: DEBUG neutronclient.v2_0.client [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1002.680328] env[66583]: ERROR nova.compute.manager [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] result = getattr(controller, method)(*args, **kwargs) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._get(image_id) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] resp, body = self.http_client.get(url, headers=header) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.request(url, 'GET', **kwargs) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._handle_response(resp) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise exc.from_response(resp, resp.content) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] During handling of the above exception, another exception occurred: [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.driver.spawn(context, instance, image_meta, [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._fetch_image_if_missing(context, vi) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image_fetch(context, vi, tmp_image_ds_loc) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] images.fetch_image( [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] metadata = IMAGE_API.get(context, image_ref) [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return session.show(context, image_id, [ 1002.680328] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] _reraise_translated_image_exception(image_id) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise new_exc.with_traceback(exc_trace) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 
4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] result = getattr(controller, method)(*args, **kwargs) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._get(image_id) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] resp, body = self.http_client.get(url, headers=header) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.request(url, 'GET', **kwargs) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._handle_response(resp) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise exc.from_response(resp, resp.content) [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] During handling of the above exception, another exception occurred: [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._build_and_run_instance(context, instance, image, [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] with excutils.save_and_reraise_exception(): [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.force_reraise() [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise self.value [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] with self.rt.instance_claim(context, instance, node, allocs, [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.abort() [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1002.681729] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return f(*args, **kwargs) [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._unset_instance_host_and_node(instance) [ 1002.682616] env[66583]: ERROR nova.compute.manager 
[instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] instance.save() [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] updates, result = self.indirection_api.object_action( [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return cctxt.call(context, 'object_action', objinst=objinst, [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] result = self.transport._send( [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._driver.send(target, ctxt, message, [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise result [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] nova.exception_Remote.InstanceNotFound_Remote: Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 could not be found. 
[ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return getattr(target, method)(*args, **kwargs) [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return fn(self, *args, **kwargs) [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] old_ref, inst_ref = db.instance_update_and_get_original( [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return f(*args, **kwargs) [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] with excutils.save_and_reraise_exception() as ectxt: [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.force_reraise() [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise self.value [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return f(*args, 
**kwargs) [ 1002.682616] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return f(context, *args, **kwargs) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise exception.InstanceNotFound(instance_id=uuid) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] nova.exception.InstanceNotFound: Instance 4fde404c-9011-4e1a-8b3c-8c89e5e45c00 could not be found. [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] During handling of the above exception, another exception occurred: [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] exception_handler_v20(status_code, error_body) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise client_exc(message=error_message, [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1002.687429] 
env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Neutron server returns request_ids: ['req-19344a60-0a4f-4abc-8ee8-bd0fb02a2c9d'] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] During handling of the above exception, another exception occurred: [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Traceback (most recent call last): [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._deallocate_network(context, instance, requested_networks) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self.network_api.deallocate_for_instance( [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] data = neutron.list_ports(**search_opts) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.list('ports', self.ports_path, retrieve_all, [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1002.687429] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] for r in self._pagination(collection, path, **params): [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] res = self.get(path, params=params) [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.retry_request("GET", action, body=body, [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] return self.do_request(method, action, body=body, [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] ret = obj(*args, **kwargs) [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] self._handle_fault_response(status_code, replybody, resp) [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] raise exception.Unauthorized() [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] nova.exception.Unauthorized: Not authorized. [ 1002.689655] env[66583]: ERROR nova.compute.manager [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] [ 1002.712138] env[66583]: DEBUG oslo_concurrency.lockutils [None req-bd196e80-fca6-4a3a-92ca-078423d823a0 tempest-ServerRescueNegativeTestJSON-1611321061 tempest-ServerRescueNegativeTestJSON-1611321061-project-member] Lock "4fde404c-9011-4e1a-8b3c-8c89e5e45c00" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 311.641s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1002.725219] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1002.764074] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1002.764908] env[66583]: ERROR nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] result = getattr(controller, method)(*args, **kwargs) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._get(image_id) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] resp, body = self.http_client.get(url, headers=header) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.request(url, 'GET', **kwargs) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._handle_response(resp) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise exc.from_response(resp, resp.content) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] During handling of the above exception, another exception occurred: [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] yield resources [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.driver.spawn(context, instance, image_meta, [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._fetch_image_if_missing(context, vi) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image_fetch(context, vi, tmp_image_ds_loc) [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] images.fetch_image( [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1002.764908] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] metadata = IMAGE_API.get(context, image_ref) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return session.show(context, image_id, [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] _reraise_translated_image_exception(image_id) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise new_exc.with_traceback(exc_trace) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] result = getattr(controller, method)(*args, **kwargs) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._get(image_id) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] resp, body = self.http_client.get(url, headers=header) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.request(url, 'GET', **kwargs) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._handle_response(resp) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise 
exc.from_response(resp, resp.content) [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1002.765924] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1002.765924] env[66583]: INFO nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Terminating instance [ 1002.767239] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1002.767438] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1002.768428] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1002.768619] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1002.768848] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0636bdab-955e-4ac2-87b5-f5ec1ca6bf48 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.775016] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e805b8fc-6950-4662-baf7-1bff47a1c1d0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.783765] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1002.784081] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cee2044b-03f9-45d1-92d8-95d80a6eb437 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.793439] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Created directory with path [datastore2] devstack-image-cache_base 
{{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1002.793554] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1002.795097] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1002.795352] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1002.798015] env[66583]: INFO nova.compute.claims [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1002.801297] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0a48e152-3139-45e0-bc62-ef70cc75eed7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.805920] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Waiting for the task: (returnval){ [ 1002.805920] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52ef2873-f578-7362-d48e-27c7c64ab5fb" [ 1002.805920] env[66583]: _type = "Task" [ 1002.805920] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1002.813987] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52ef2873-f578-7362-d48e-27c7c64ab5fb, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1002.856356] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1002.856621] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1002.856802] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Deleting the datastore file [datastore2] 12bc9e29-ecea-40e9-af34-a067f3d2301f {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1002.857218] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-67a86e63-9346-48fa-a58e-c020fb2d7d93 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1002.863900] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Waiting for the task: (returnval){ [ 1002.863900] env[66583]: value = "task-3470352" [ 1002.863900] env[66583]: _type = "Task" [ 1002.863900] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1002.872323] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Task: {'id': task-3470352, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.000490] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Successfully created port: 06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1003.030850] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ee5fd1-3712-49ce-af43-8d418bb20496 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.038893] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e5f8887-52cd-4601-b86f-bbcecbae1a0c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.072059] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699926b6-90e1-496b-9bbb-6b696f943548 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.080018] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5b91b8e-ff3e-4930-90cb-afc57fe24c8f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.092715] env[66583]: DEBUG nova.compute.provider_tree [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1003.101988] env[66583]: DEBUG nova.scheduler.client.report [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1003.114482] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.114977] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Start building networks asynchronously for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1003.149451] env[66583]: DEBUG nova.compute.utils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1003.151872] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1003.152070] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1003.170341] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1003.172865] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1003.172865] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1003.176237] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1003.241102] env[66583]: DEBUG nova.policy [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7182b038163940ebaa315bce567f4870', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e450d78a63a64fcda4b141e03517015c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1003.245040] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead 
tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1003.269885] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1003.270050] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1003.270183] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1003.270479] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1003.270627] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1003.271015] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1003.271015] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1003.271151] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 
tempest-ServerPasswordTestJSON-2079852061-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1003.271331] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1003.271498] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1003.271672] env[66583]: DEBUG nova.virt.hardware [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1003.272799] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b0df085-00c1-4767-a4f3-38031e096dd9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.281766] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68d1ee42-97c2-4d96-aeca-4f76ffce19c1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.316762] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1003.317414] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Creating directory with path [datastore2] vmware_temp/6929a605-6d26-43f0-ad40-bd050ecea024/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1003.317674] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f253bef8-3008-471e-8368-5de302bf78a6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.329927] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Created directory with path [datastore2] vmware_temp/6929a605-6d26-43f0-ad40-bd050ecea024/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1003.330174] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Fetch image to [datastore2] 
vmware_temp/6929a605-6d26-43f0-ad40-bd050ecea024/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1003.330425] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/6929a605-6d26-43f0-ad40-bd050ecea024/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1003.331155] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5339e2fb-e2aa-4f49-81e6-379e7cd7eff2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.340414] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1a5f4d5-f615-449e-95e5-4dfd763795ad {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.355398] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-977d8a61-3730-4dc4-9a82-5420c87abf68 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.393411] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e64382f-e8c2-4a09-9e64-afb1940e8055 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.400637] env[66583]: DEBUG oslo_vmware.api [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Task: {'id': task-3470352, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077229} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1003.402169] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1003.402365] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1003.402542] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1003.402716] env[66583]: INFO nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1003.404800] env[66583]: DEBUG nova.compute.claims [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1003.404975] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.405250] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.408007] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0b53fd9f-e7d7-4ade-a3c5-39c20c5aef48 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.432290] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1003.439955] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 
tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.035s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.440752] env[66583]: DEBUG nova.compute.utils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1003.444340] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1003.444546] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1003.444723] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1003.444892] env[66583]: DEBUG nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1003.445145] env[66583]: DEBUG nova.network.neutron [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1003.484595] env[66583]: DEBUG neutronclient.v2_0.client [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1003.485199] env[66583]: ERROR nova.compute.manager [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] result = getattr(controller, method)(*args, **kwargs) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._get(image_id) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] resp, body = self.http_client.get(url, headers=header) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.request(url, 'GET', **kwargs) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._handle_response(resp) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise exc.from_response(resp, resp.content) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] During handling of the above exception, another exception occurred: [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.driver.spawn(context, instance, image_meta, [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._fetch_image_if_missing(context, vi) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image_fetch(context, vi, tmp_image_ds_loc) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] images.fetch_image( [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] metadata = IMAGE_API.get(context, image_ref) [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return session.show(context, image_id, [ 1003.485199] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] _reraise_translated_image_exception(image_id) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise new_exc.with_traceback(exc_trace) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 
12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] result = getattr(controller, method)(*args, **kwargs) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._get(image_id) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] resp, body = self.http_client.get(url, headers=header) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.request(url, 'GET', **kwargs) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._handle_response(resp) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise exc.from_response(resp, resp.content) [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] During handling of the above exception, another exception occurred: [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._build_and_run_instance(context, instance, image, [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] with excutils.save_and_reraise_exception(): [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.force_reraise() [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise self.value [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] with self.rt.instance_claim(context, instance, node, allocs, [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.abort() [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1003.486947] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return f(*args, **kwargs) [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._unset_instance_host_and_node(instance) [ 1003.488031] env[66583]: ERROR nova.compute.manager 
[instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] instance.save() [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] updates, result = self.indirection_api.object_action( [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return cctxt.call(context, 'object_action', objinst=objinst, [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] result = self.transport._send( [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._driver.send(target, ctxt, message, [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise result [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] nova.exception_Remote.InstanceNotFound_Remote: Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f could not be found. 
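The nova.exception_Remote.InstanceNotFound_Remote type in the record above is not defined anywhere in the nova tree: oslo.messaging serializes server-side exceptions (here, raised in the conductor during instance.save()) and rebuilds them on the RPC caller as dynamically generated classes whose module gains a _Remote suffix and whose string form carries the remote traceback. A simplified, self-contained illustration of that reconstruction; the shape is an assumption for clarity, not the oslo.messaging implementation:

    class InstanceNotFound(Exception):
        """Stand-in for nova.exception.InstanceNotFound (illustrative only)."""

    def rebuild_remote(exc_cls, message, remote_tb_text):
        # Build a subclass named "<Original>_Remote" so the log makes clear
        # that the exception was raised on the far side of the RPC call and
        # merely re-raised locally, with the server traceback appended.
        def __str__(self):
            return "%s\n%s" % (message, remote_tb_text)

        remote_cls = type(exc_cls.__name__ + "_Remote", (exc_cls,),
                          {"__str__": __str__})
        remote_cls.__module__ = exc_cls.__module__ + "_Remote"
        return remote_cls(message)

    err = rebuild_remote(
        InstanceNotFound,
        "Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f could not be found.",
        "Traceback (most recent call last): ...")
    print(type(err).__name__)  # InstanceNotFound_Remote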
[ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return getattr(target, method)(*args, **kwargs) [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return fn(self, *args, **kwargs) [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] old_ref, inst_ref = db.instance_update_and_get_original( [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return f(*args, **kwargs) [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] with excutils.save_and_reraise_exception() as ectxt: [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.force_reraise() [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise self.value [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return f(*args, 
**kwargs) [ 1003.488031] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return f(context, *args, **kwargs) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise exception.InstanceNotFound(instance_id=uuid) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] nova.exception.InstanceNotFound: Instance 12bc9e29-ecea-40e9-af34-a067f3d2301f could not be found. [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] During handling of the above exception, another exception occurred: [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] exception_handler_v20(status_code, error_body) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise client_exc(message=error_message, [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1003.489616] 
env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Neutron server returns request_ids: ['req-aa0f1457-0032-4c2b-802c-e7cf742962dc'] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] During handling of the above exception, another exception occurred: [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Traceback (most recent call last): [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._deallocate_network(context, instance, requested_networks) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self.network_api.deallocate_for_instance( [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] data = neutron.list_ports(**search_opts) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.list('ports', self.ports_path, retrieve_all, [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1003.489616] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] for r in self._pagination(collection, path, **params): [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] res = self.get(path, params=params) [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.retry_request("GET", action, body=body, [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] return self.do_request(method, action, body=body, [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] ret = obj(*args, **kwargs) [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] self._handle_fault_response(status_code, replybody, resp) [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] raise exception.Unauthorized() [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] nova.exception.Unauthorized: Not authorized. [ 1003.490560] env[66583]: ERROR nova.compute.manager [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] [ 1003.514539] env[66583]: DEBUG oslo_concurrency.lockutils [None req-1821e1a7-9ac6-4661-a484-e90b46533c6b tempest-AttachInterfacesTestJSON-1565816277 tempest-AttachInterfacesTestJSON-1565816277-project-member] Lock "12bc9e29-ecea-40e9-af34-a067f3d2301f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 312.158s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1003.525572] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1003.556714] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1003.557533] env[66583]: ERROR nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last): [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] result = getattr(controller, method)(*args, **kwargs) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self._get(image_id) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] resp, body = self.http_client.get(url, headers=header) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self.request(url, 'GET', **kwargs) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self._handle_response(resp) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] raise exc.from_response(resp, resp.content) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] During handling of the above exception, another exception occurred: [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last): [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] yield resources [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] self.driver.spawn(context, instance, image_meta, [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] self._fetch_image_if_missing(context, vi) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] image_fetch(context, vi, tmp_image_ds_loc) [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] images.fetch_image( [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1003.557533] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] metadata = IMAGE_API.get(context, image_ref) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return session.show(context, image_id, [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] _reraise_translated_image_exception(image_id) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] raise new_exc.with_traceback(exc_trace) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] result = getattr(controller, method)(*args, **kwargs) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self._get(image_id) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] resp, body = self.http_client.get(url, headers=header) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self.request(url, 'GET', **kwargs) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] return self._handle_response(resp) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] raise 
exc.from_response(resp, resp.content) [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1003.558498] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] [ 1003.558498] env[66583]: INFO nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Terminating instance [ 1003.559540] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.559761] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1003.560412] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1003.560655] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1003.560915] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e8ab4329-9e00-4d27-b50a-d885b895be10 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.563694] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8e9bbbe-b7f6-4518-bdc5-fc0f0dab422c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.574895] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1003.576058] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ec637180-f0a6-418d-a59b-66255a263bcd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.577634] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Created directory with 
path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1003.577809] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1003.578712] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cd3b9b8-37ef-403d-a414-94c212d777df {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.584748] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Waiting for the task: (returnval){ [ 1003.584748] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52ea2084-f357-a6a0-8610-ee087f5f17dc" [ 1003.584748] env[66583]: _type = "Task" [ 1003.584748] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.589286] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1003.589546] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1003.591122] env[66583]: INFO nova.compute.claims [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1003.599953] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1003.599953] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Creating directory with path [datastore2] vmware_temp/a02843f2-f9b0-4586-a0c6-d0f5eb8b1d57/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1003.600170] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f9bfa39f-3516-426d-aabe-6c29b176eeb5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.620419] env[66583]: DEBUG 
nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Created directory with path [datastore2] vmware_temp/a02843f2-f9b0-4586-a0c6-d0f5eb8b1d57/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1003.620704] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Fetch image to [datastore2] vmware_temp/a02843f2-f9b0-4586-a0c6-d0f5eb8b1d57/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1003.620886] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/a02843f2-f9b0-4586-a0c6-d0f5eb8b1d57/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1003.621686] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84b56dd7-0b9f-4712-a7c9-a4480fa34b33 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.632173] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b582ab7-df1e-475f-b7c1-e62f3e1eced3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.643518] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-233915a2-5a33-49fb-b699-e22539209525 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.648327] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1003.648327] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1003.648327] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Deleting the datastore file [datastore2] 6deed686-ceca-45a1-b8e4-2461b2e3f039 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1003.648680] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-66cc8db0-1bbd-4845-92fb-428214f264b7 {{(pid=66583) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.659884] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Waiting for the task: (returnval){ [ 1003.659884] env[66583]: value = "task-3470354" [ 1003.659884] env[66583]: _type = "Task" [ 1003.659884] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1003.691562] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a6becca-811f-42ad-aadc-3db834812f76 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.700419] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Task: {'id': task-3470354, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1003.702281] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0d003bd8-bcb0-4757-9d1f-451924585e41 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.721877] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1003.858594] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26a8b321-8226-4a79-a8ba-2c94af051167 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.867140] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06d82d98-b7f6-4577-a36c-167117db037c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.901328] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28e6566d-02ac-4db1-acbf-bc39513fe52f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.909355] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08d9e38e-b406-4250-bd42-6fb3c80a8f0b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.914259] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1003.914802] env[66583]: ERROR nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 
tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] result = getattr(controller, method)(*args, **kwargs) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._get(image_id) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] resp, body = self.http_client.get(url, headers=header) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.request(url, 'GET', **kwargs) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._handle_response(resp) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exc.from_response(resp, resp.content) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
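The "During handling of the above exception, another exception occurred:" banners that stitch these tracebacks together are plain CPython behaviour, not nova logic: an exception raised inside an except (or finally) block keeps the prior exception on its __context__ attribute, and the default traceback printer renders both. A tiny self-contained demonstration:

    def fetch_image():
        raise PermissionError("HTTP 401 Unauthorized")  # stands in for the Glance call

    try:
        try:
            fetch_image()
        except PermissionError:
            # Failing inside the handler implicitly chains the new error
            # onto the 401, producing the "During handling ..." banner.
            raise RuntimeError("instance cleanup failed")
    except RuntimeError as exc:
        assert isinstance(exc.__context__, PermissionError)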
[ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] During handling of the above exception, another exception occurred: [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] yield resources [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.driver.spawn(context, instance, image_meta, [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._fetch_image_if_missing(context, vi) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image_fetch(context, vi, tmp_image_ds_loc) [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] images.fetch_image( [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1003.914802] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] metadata = IMAGE_API.get(context, image_ref) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return session.show(context, image_id, [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] _reraise_translated_image_exception(image_id) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise new_exc.with_traceback(exc_trace) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] result = getattr(controller, method)(*args, **kwargs) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._get(image_id) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] resp, body = self.http_client.get(url, headers=header) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.request(url, 'GET', **kwargs) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._handle_response(resp) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exc.from_response(resp, resp.content) [ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1003.915988] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1003.915988] env[66583]: INFO nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Terminating instance [ 1003.916870] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1003.917087] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1003.917686] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1003.917866] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1003.918429] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ebc0e83b-7cfe-4d62-84d0-940bcd0b428b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.921069] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a51ef118-b527-4567-b448-d771791c024c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.931583] env[66583]: DEBUG nova.compute.provider_tree [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1003.938047] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1003.939130] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-f8e88b4b-6c54-48bf-9f3e-07710f343e87 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1003.940819] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1003.940997] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1003.942272] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8bab928a-c532-4669-a185-eb6705adf307 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1003.944966] env[66583]: DEBUG nova.scheduler.client.report [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1003.952422] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Waiting for the task: (returnval){
[ 1003.952422] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5218e38e-50a5-45f1-d583-94bd314c5fb7"
[ 1003.952422] env[66583]: _type = "Task"
[ 1003.952422] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1003.962170] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5218e38e-50a5-45f1-d583-94bd314c5fb7, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1003.967610] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1003.968195] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1004.003196] env[66583]: DEBUG nova.compute.utils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1004.004602] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1004.004779] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1004.016474] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1004.016568] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1004.018032] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Deleting the datastore file [datastore2] 9915557d-4251-44a2-bf59-3dd542dfb527 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1004.018032] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f181b6b-4e80-4624-afd2-73934fd1780d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1004.020037] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1004.027457] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Waiting for the task: (returnval){
[ 1004.027457] env[66583]: value = "task-3470356"
[ 1004.027457] env[66583]: _type = "Task"
[ 1004.027457] env[66583]: } to complete.
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1004.038314] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Task: {'id': task-3470356, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1004.093392] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1004.118164] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1004.118440] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1004.118599] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1004.118783] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1004.118930] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1004.119096] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1004.119321] env[66583]: 
DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1004.119486] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1004.119657] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1004.120188] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1004.120188] env[66583]: DEBUG nova.virt.hardware [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1004.120870] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7652d22-a150-47bb-b095-0cc7c1b212ea {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.124181] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Successfully created port: 6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1004.132283] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fa6b5e3-7635-4abf-b4a9-01e4f05533b2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.193291] env[66583]: DEBUG oslo_vmware.api [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Task: {'id': task-3470354, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069369} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1004.193609] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1004.193797] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1004.193968] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1004.194157] env[66583]: INFO nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Took 0.63 seconds to destroy the instance on the hypervisor. [ 1004.196629] env[66583]: DEBUG nova.compute.claims [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1004.196791] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1004.197009] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1004.216284] env[66583]: DEBUG nova.policy [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '85c095c1663344a5817d3e35eef81dd4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fb348123191d492f8cf6a6bd7f8ca357', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1004.222419] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 
tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1004.223394] env[66583]: DEBUG nova.compute.utils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1004.224821] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1004.224989] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1004.225201] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1004.225362] env[66583]: DEBUG nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1004.225523] env[66583]: DEBUG nova.network.neutron [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1004.443214] env[66583]: DEBUG neutronclient.v2_0.client [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1004.444767] env[66583]: ERROR nova.compute.manager [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     result = getattr(controller, method)(*args, **kwargs)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._get(image_id)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     resp, body = self.http_client.get(url, headers=header)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self.request(url, 'GET', **kwargs)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._handle_response(resp)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise exc.from_response(resp, resp.content)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] During handling of the above exception, another exception occurred:
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.driver.spawn(context, instance, image_meta,
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._fetch_image_if_missing(context, vi)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     image_fetch(context, vi, tmp_image_ds_loc)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     images.fetch_image(
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     metadata = IMAGE_API.get(context, image_ref)
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return session.show(context, image_id,
[ 1004.444767] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     _reraise_translated_image_exception(image_id)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise new_exc.with_traceback(exc_trace)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     result = getattr(controller, method)(*args, **kwargs)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._get(image_id)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return RequestIdProxy(wrapped(*args, **kwargs))
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     resp, body = self.http_client.get(url, headers=header)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self.request(url, 'GET', **kwargs)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._handle_response(resp)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise exc.from_response(resp, resp.content)
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] During handling of the above exception, another exception occurred:
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._build_and_run_instance(context, instance, image,
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     with excutils.save_and_reraise_exception():
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.force_reraise()
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise self.value
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     with self.rt.instance_claim(context, instance, node, allocs,
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.abort()
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1004.447572] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return f(*args, **kwargs)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._unset_instance_host_and_node(instance)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     instance.save()
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     updates, result = self.indirection_api.object_action(
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     result = self.transport._send(
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._driver.send(target, ctxt, message,
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise result
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] nova.exception_Remote.InstanceNotFound_Remote: Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 could not be found.
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return getattr(target, method)(*args, **kwargs)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return fn(self, *args, **kwargs)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return f(*args, **kwargs)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     with excutils.save_and_reraise_exception() as ectxt:
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.force_reraise()
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise self.value
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return f(*args, **kwargs)
[ 1004.450149] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return f(context, *args, **kwargs)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] nova.exception.InstanceNotFound: Instance 6deed686-ceca-45a1-b8e4-2461b2e3f039 could not be found.
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] During handling of the above exception, another exception occurred:
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     exception_handler_v20(status_code, error_body)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise client_exc(message=error_message,
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Neutron server returns request_ids: ['req-7db5599b-9dff-49f3-83f8-d8662ceeadb6']
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] During handling of the above exception, another exception occurred:
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Traceback (most recent call last):
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._deallocate_network(context, instance, requested_networks)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self.network_api.deallocate_for_instance(
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     data = neutron.list_ports(**search_opts)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self.list('ports', self.ports_path, retrieve_all,
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1004.451731] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     for r in self._pagination(collection, path, **params):
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     res = self.get(path, params=params)
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self.retry_request("GET", action, body=body,
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     return self.do_request(method, action, body=body,
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     ret = obj(*args, **kwargs)
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     self._handle_fault_response(status_code, replybody, resp)
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039]     raise exception.Unauthorized()
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] nova.exception.Unauthorized: Not authorized.
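The traceback above is several exceptions deep: the glanceclient 401 leads to ImageNotAuthorized, aborting the claim then hits InstanceNotFound, and the network cleanup finally fails with Unauthorized. The "During handling of the above exception, another exception occurred:" separators are plain Python implicit exception chaining, which appears whenever cleanup or error-handling code raises while an earlier exception is in flight. A minimal, self-contained reproduction with stand-in exception types (not Nova's code):

    def cleanup():
        # Secondary failure, standing in for the neutron 401 hit during
        # deallocate_for_instance.
        raise PermissionError('Not authorized.')

    try:
        # Primary failure, standing in for InstanceNotFound raised while
        # aborting the resource claim.
        raise LookupError('instance could not be found')
    except LookupError:
        # Raising inside an except block chains the two exceptions and
        # prints the "During handling of the above exception, another
        # exception occurred" separator between their tracebacks.
        cleanup()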
[ 1004.455074] env[66583]: ERROR nova.compute.manager [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] [ 1004.464054] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1004.464232] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Creating directory with path [datastore2] vmware_temp/0e63d6f1-182b-4ccc-bfdc-ab5be00b9b60/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1004.466551] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-71e2ac39-52bc-49a7-b367-5d5a94ca2823 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.471825] env[66583]: DEBUG nova.compute.manager [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Received event network-vif-plugged-06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1004.472517] env[66583]: DEBUG oslo_concurrency.lockutils [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] Acquiring lock "68449c86-cda6-46ff-a349-c2072829257e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1004.472897] env[66583]: DEBUG oslo_concurrency.lockutils [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] Lock "68449c86-cda6-46ff-a349-c2072829257e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1004.473257] env[66583]: DEBUG oslo_concurrency.lockutils [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] Lock "68449c86-cda6-46ff-a349-c2072829257e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.473701] env[66583]: DEBUG nova.compute.manager [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] No waiting events found dispatching network-vif-plugged-06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1004.474087] env[66583]: WARNING nova.compute.manager [req-29fae090-12a1-4277-a76b-ff8999cecb92 req-2ff4e457-5291-46c3-aa6f-da595be0d003 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Received unexpected event network-vif-plugged-06baa48e-0c25-4b8b-9381-d08a4a23a21b for instance with vm_state building and task_state spawning. 
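The lock bookkeeping lines above ("Acquiring lock ... by ...", "acquired ... waited 0.000s", '"released" ... held 0.000s') come from oslo.concurrency's lockutils, which Nova uses to serialize work such as resource claims and per-instance event handling. A hedged sketch of the usual pattern behind those lines; the lock name mirrors the log, while the function body is a placeholder, not Nova's implementation:

    from oslo_concurrency import lockutils

    # The synchronized decorator emits the Acquiring/acquired/released
    # DEBUG lines seen above, attributing the lock to the wrapped
    # function and reporting wait and hold durations.
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(context, instance):
        # Bookkeeping would run here while the lock is held; the
        # "held N.NNNs" figure measures this critical section.
        pass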
[ 1004.476653] env[66583]: DEBUG oslo_concurrency.lockutils [None req-a975f736-161e-4696-9d92-633a98623f6d tempest-AttachVolumeNegativeTest-1676731590 tempest-AttachVolumeNegativeTest-1676731590-project-member] Lock "6deed686-ceca-45a1-b8e4-2461b2e3f039" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 309.342s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.484859] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Created directory with path [datastore2] vmware_temp/0e63d6f1-182b-4ccc-bfdc-ab5be00b9b60/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1004.486104] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Fetch image to [datastore2] vmware_temp/0e63d6f1-182b-4ccc-bfdc-ab5be00b9b60/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1004.486364] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/0e63d6f1-182b-4ccc-bfdc-ab5be00b9b60/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1004.487881] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecf302b6-5cc7-49c8-bb25-a3a6a8541fe5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.495908] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1004.498968] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-395179b6-9787-4162-a814-c88073cd2f73 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.510103] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ba8360-92c4-4b9c-923d-9cc1cc8ac133 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.543869] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Successfully updated port: 06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1004.552040] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f06d836e-7b26-4f8c-a90d-8b3527789f46 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.557700] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1004.557855] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquired lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1004.557976] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1004.564588] env[66583]: DEBUG oslo_vmware.api [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Task: {'id': task-3470356, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067196} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1004.566671] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1004.566671] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1004.566671] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1004.566671] env[66583]: INFO nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Took 0.65 seconds to destroy the instance on the hypervisor. [ 1004.569214] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c934d47c-654d-4474-9e2b-dc28abac0ef4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.570817] env[66583]: DEBUG nova.compute.claims [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1004.570988] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1004.571215] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1004.580150] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1004.599216] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-8c21c279-8ee6-48ca-8209-427898aa1432 
tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1004.605347] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.034s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.605898] env[66583]: DEBUG nova.compute.utils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance 9915557d-4251-44a2-bf59-3dd542dfb527 could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1004.607434] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.027s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1004.609431] env[66583]: INFO nova.compute.claims [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1004.612275] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1004.612415] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1004.612579] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1004.612745] env[66583]: DEBUG nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1004.612914] env[66583]: DEBUG nova.network.neutron [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1004.627764] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1004.676316] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Successfully created port: 1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1004.770454] env[66583]: DEBUG neutronclient.v2_0.client [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1004.772025] env[66583]: ERROR nova.compute.manager [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] result = getattr(controller, method)(*args, **kwargs) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._get(image_id) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] resp, body = self.http_client.get(url, headers=header) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.request(url, 'GET', **kwargs) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._handle_response(resp) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exc.from_response(resp, resp.content) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
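[editor's note] The inner traceback above ends with glanceclient raising HTTPUnauthorized for the 401 it got from Glance; the chained segment that follows shows nova's image layer (glance.py:287/1031) converting that into nova.exception.ImageNotAuthorized while keeping the original traceback. A minimal sketch of that translate-and-reraise pattern, using stand-in exception classes rather than nova's real hierarchy:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""

    def _reraise_translated_image_exception(image_id):
        # Grab the exception currently being handled and re-raise it as a
        # nova-level error, preserving the original traceback the way
        # raise new_exc.with_traceback(exc_trace) does in the frames above.
        _exc_type, _exc_value, exc_trace = sys.exc_info()
        new_exc = ImageNotAuthorized(f"Not authorized for image {image_id}.")
        raise new_exc.with_traceback(exc_trace)

    def show(client, image_id):
        try:
            return client.get(image_id)  # raises HTTPUnauthorized on a 401
        except HTTPUnauthorized:
            _reraise_translated_image_exception(image_id)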
[ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] During handling of the above exception, another exception occurred: [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.driver.spawn(context, instance, image_meta, [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._fetch_image_if_missing(context, vi) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image_fetch(context, vi, tmp_image_ds_loc) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] images.fetch_image( [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] metadata = IMAGE_API.get(context, image_ref) [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return session.show(context, image_id, [ 1004.772025] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] _reraise_translated_image_exception(image_id) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise new_exc.with_traceback(exc_trace) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 
9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] result = getattr(controller, method)(*args, **kwargs) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._get(image_id) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] resp, body = self.http_client.get(url, headers=header) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.request(url, 'GET', **kwargs) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._handle_response(resp) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exc.from_response(resp, resp.content) [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
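[editor's note] Each "During handling of the above exception, another exception occurred" divider in this log is Python's implicit exception chaining: a new exception was raised while the previous one was still being handled, so the interpreter prints every link in the chain. A two-function illustration of how such dividers arise:

    def fetch():
        raise KeyError("image")  # the original failure

    try:
        fetch()
    except KeyError:
        # Raising inside the except block produces the same
        # "During handling of the above exception..." divider seen above.
        raise RuntimeError("Not authorized for image.")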
[ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] During handling of the above exception, another exception occurred: [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._build_and_run_instance(context, instance, image, [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] with excutils.save_and_reraise_exception(): [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.force_reraise() [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise self.value [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] with self.rt.instance_claim(context, instance, node, allocs, [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.abort() [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1004.773065] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return f(*args, **kwargs) [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._unset_instance_host_and_node(instance) [ 1004.774164] env[66583]: ERROR nova.compute.manager 
[instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] instance.save() [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] updates, result = self.indirection_api.object_action( [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return cctxt.call(context, 'object_action', objinst=objinst, [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] result = self.transport._send( [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._driver.send(target, ctxt, message, [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise result [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] nova.exception_Remote.InstanceNotFound_Remote: Instance 9915557d-4251-44a2-bf59-3dd542dfb527 could not be found. 
[ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return getattr(target, method)(*args, **kwargs) [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return fn(self, *args, **kwargs) [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] old_ref, inst_ref = db.instance_update_and_get_original( [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return f(*args, **kwargs) [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] with excutils.save_and_reraise_exception() as ectxt: [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.force_reraise() [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise self.value [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return f(*args, 
**kwargs) [ 1004.774164] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return f(context, *args, **kwargs) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exception.InstanceNotFound(instance_id=uuid) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] nova.exception.InstanceNotFound: Instance 9915557d-4251-44a2-bf59-3dd542dfb527 could not be found. [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] During handling of the above exception, another exception occurred: [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] exception_handler_v20(status_code, error_body) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise client_exc(message=error_message, [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1004.775406] 
env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Neutron server returns request_ids: ['req-25cdad35-56a3-4569-b043-526453b602b9'] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] During handling of the above exception, another exception occurred: [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Traceback (most recent call last): [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._deallocate_network(context, instance, requested_networks) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self.network_api.deallocate_for_instance( [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] data = neutron.list_ports(**search_opts) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.list('ports', self.ports_path, retrieve_all, [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1004.775406] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] for r in self._pagination(collection, path, **params): [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] res = self.get(path, params=params) [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.retry_request("GET", action, body=body, [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] return self.do_request(method, action, body=body, [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] ret = obj(*args, **kwargs) [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] self._handle_fault_response(status_code, replybody, resp) [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] raise exception.Unauthorized() [ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] nova.exception.Unauthorized: Not authorized. 
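[editor's note] The final segment shows the cleanup itself failing: the neutronclient call made through nova's wrapper (nova/network/neutron.py:196) received a 401 from Neutron, and the wrapper at line 204 re-raised it as nova.exception.Unauthorized. A rough sketch of that proxy-wrapper pattern, again with stand-in exception types:

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    class NeutronUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    def proxy(func):
        # neutronclient methods nova calls go through a wrapper like this,
        # so a raw 401 from Neutron surfaces to compute code as a single
        # nova-level exception type.
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except NeutronUnauthorized:
                raise Unauthorized()
        return wrapper

    @proxy
    def list_ports(**search_opts):
        raise NeutronUnauthorized("401: authentication required")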
[ 1004.776490] env[66583]: ERROR nova.compute.manager [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] [ 1004.799355] env[66583]: DEBUG oslo_concurrency.lockutils [None req-060ceef9-9c1c-4659-94dd-127971d0d402 tempest-ImagesOneServerNegativeTestJSON-1770191682 tempest-ImagesOneServerNegativeTestJSON-1770191682-project-member] Lock "9915557d-4251-44a2-bf59-3dd542dfb527" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 306.316s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.805053] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1004.807439] env[66583]: ERROR nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] result = getattr(controller, method)(*args, **kwargs) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._get(image_id) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] resp, body = self.http_client.get(url, headers=header) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.request(url, 'GET', 
**kwargs) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._handle_response(resp) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exc.from_response(resp, resp.content) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] During handling of the above exception, another exception occurred: [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] yield resources [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.driver.spawn(context, instance, image_meta, [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._fetch_image_if_missing(context, vi) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image_fetch(context, vi, tmp_image_ds_loc) [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] 
images.fetch_image( [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1004.807439] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] metadata = IMAGE_API.get(context, image_ref) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return session.show(context, image_id, [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] _reraise_translated_image_exception(image_id) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise new_exc.with_traceback(exc_trace) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] result = getattr(controller, method)(*args, **kwargs) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._get(image_id) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] resp, body = self.http_client.get(url, headers=header) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.request(url, 'GET', **kwargs) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1004.808720] 
env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._handle_response(resp) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exc.from_response(resp, resp.content) [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1004.808720] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1004.808720] env[66583]: INFO nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Terminating instance [ 1004.808720] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1004.808720] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1004.810436] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-82e78fa4-e1ef-4b3d-b23b-9d80f8f357eb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.815292] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1004.815292] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1004.815292] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1004.821047] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fc2fe3e-d37d-43ea-9514-2e4949996b5f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.829346] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1004.831852] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-780461a1-dea9-4f9c-883b-db13b4c08747 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.835933] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1004.836883] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1004.837168] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c18b5394-b122-4ca0-bac8-bcf258f7bb9b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.843699] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Waiting for the task: (returnval){ [ 1004.843699] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52fbcbb0-f48b-7cbf-2a5a-ad993731c5fb" [ 1004.843699] env[66583]: _type = "Task" [ 1004.843699] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1004.848976] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13453e9b-ea77-4a4f-a3a6-ee98c8d4e5a1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.861119] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1004.861362] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Creating directory with path [datastore2] vmware_temp/75445850-2e0b-4683-97d9-c373d5c956c8/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1004.862225] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50b1872e-6bff-412e-acbf-ed98a5db5b72 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.865463] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4fa8c237-f94a-4cf1-b06d-eaa4a7eef74a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.897659] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1004.899245] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eca02554-579a-46f2-970b-c2fbfeb23581 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.901684] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1004.901877] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1004.902061] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Deleting the datastore file [datastore2] 83ac0082-b7fe-408d-9d5a-6e614ae7e61a {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1004.902316] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 
tempest-ServersTestJSON-2037535159-project-member] Created directory with path [datastore2] vmware_temp/75445850-2e0b-4683-97d9-c373d5c956c8/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1004.902496] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Fetch image to [datastore2] vmware_temp/75445850-2e0b-4683-97d9-c373d5c956c8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1004.902655] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore2] vmware_temp/75445850-2e0b-4683-97d9-c373d5c956c8/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore2 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1004.902860] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-228a28ef-268d-4317-9aca-e4b9476aed0f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.904866] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b14c99c1-2d90-44d0-a475-52cfbe16a0a0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.914557] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db20a819-74d6-4262-ac4b-4742e05a37c7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.918423] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Waiting for the task: (returnval){ [ 1004.918423] env[66583]: value = "task-3470358" [ 1004.918423] env[66583]: _type = "Task" [ 1004.918423] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1004.919296] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e23b77d3-d82e-4d28-8a64-c6ccaa01fcd8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.932134] env[66583]: DEBUG nova.compute.provider_tree [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1004.939157] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2e89d6d-f2b3-4e43-822a-b8f48ba006eb {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.948458] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Task: {'id': task-3470358, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1004.949498] env[66583]: DEBUG nova.scheduler.client.report [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1004.985903] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.378s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1004.986434] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Start building networks asynchronously for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1004.989179] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1239b4d0-6e7f-485b-bdeb-e029c5300f86 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1004.991966] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.094s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1004.993340] env[66583]: INFO nova.compute.claims [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1005.000402] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-33b7f0dd-a2a7-4018-9c8f-f33765826e7d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.021483] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore2 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1005.024926] env[66583]: DEBUG nova.compute.utils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1005.026552] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1005.026688] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1005.038757] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1005.115945] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1005.125482] env[66583]: DEBUG nova.policy [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '747f0b91c4fe46c49fe1a390053463e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d88e984e6674da9a33c06705de9d7e2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1005.168167] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1005.168410] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1005.168568] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1005.168801] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1005.168888] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1005.169141] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1005.169418] 
env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1005.169589] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1005.169822] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1005.170029] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1005.170211] env[66583]: DEBUG nova.virt.hardware [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1005.171375] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db520538-4207-485d-ac05-f2d96f3e8b41 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.174661] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1005.175408] env[66583]: ERROR nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] result = getattr(controller, method)(*args, **kwargs) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._get(image_id) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] resp, body = self.http_client.get(url, headers=header) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self.request(url, 'GET', **kwargs) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._handle_response(resp) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise exc.from_response(resp, resp.content) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] During handling of the above exception, another exception occurred: [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] yield resources [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.driver.spawn(context, instance, image_meta, [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._fetch_image_if_missing(context, vi) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image_fetch(context, vi, tmp_image_ds_loc) [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] images.fetch_image( [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1005.175408] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] metadata = IMAGE_API.get(context, image_ref) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return session.show(context, image_id, [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] _reraise_translated_image_exception(image_id) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise new_exc.with_traceback(exc_trace) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] result = getattr(controller, method)(*args, **kwargs) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._get(image_id) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] resp, body = self.http_client.get(url, headers=header) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self.request(url, 'GET', **kwargs) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._handle_response(resp) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise exc.from_response(resp, resp.content) [ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1005.176435] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1005.176435] env[66583]: INFO nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Terminating instance [ 1005.183684] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1005.183890] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1005.185223] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f041ec9a-056f-4a01-8bf4-2e6ac77e1366 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.192378] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27bbfa27-210c-4bf2-b97e-a0ce9e1fbd5b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.201021] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1005.201738] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-48083b29-2a00-4e8e-a6bd-52097bf7f090 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.224598] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caeddf9c-f8ca-4f68-82b6-e59ab5263c22 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.231416] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f89e54a-5b70-4a62-9cc0-0c4b89a34380 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.264296] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-755c9b8b-559f-45b4-a2a6-1d8ab274020f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.267112] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1005.267329] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: 
a14582eb-f78f-44d6-8c82-16976c0cec5b] Deleting contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1005.267508] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Deleting the datastore file [datastore2] a14582eb-f78f-44d6-8c82-16976c0cec5b {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1005.267740] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-da061368-e29f-4b9d-b93c-7d9436ccbd21 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.275108] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1700a787-dcbb-47d8-8dfa-d6cea2a7b0c8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.278764] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Waiting for the task: (returnval){ [ 1005.278764] env[66583]: value = "task-3470360" [ 1005.278764] env[66583]: _type = "Task" [ 1005.278764] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1005.290892] env[66583]: DEBUG nova.compute.provider_tree [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1005.296313] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Task: {'id': task-3470360, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1005.299741] env[66583]: DEBUG nova.scheduler.client.report [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1005.312538] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.321s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1005.313061] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1005.347559] env[66583]: DEBUG nova.compute.utils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1005.349232] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1005.349407] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1005.358983] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Start building block device mappings for instance. 
{{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1005.410532] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Updating instance_info_cache with network_info: [{"id": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "address": "fa:16:3e:a5:b3:9b", "network": {"id": "ee5be6c6-6873-4f32-ae75-f00e9b5a25eb", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1482290419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60cc48aabaaf4189a35327c52cfdfce0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06baa48e-0c", "ovs_interfaceid": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1005.428585] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Releasing lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1005.429098] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance network_info: |[{"id": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "address": "fa:16:3e:a5:b3:9b", "network": {"id": "ee5be6c6-6873-4f32-ae75-f00e9b5a25eb", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1482290419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60cc48aabaaf4189a35327c52cfdfce0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06baa48e-0c", "ovs_interfaceid": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| 
{{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1005.430569] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a5:b3:9b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '24210a23-d8ac-4f4f-84ac-dc0636de9a72', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '06baa48e-0c25-4b8b-9381-d08a4a23a21b', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1005.444586] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Creating folder: Project (60cc48aabaaf4189a35327c52cfdfce0). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.450911] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2424fe65-0792-4852-80be-f89a91cac0c6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.456893] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1005.457781] env[66583]: DEBUG oslo_vmware.api [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Task: {'id': task-3470358, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.089808} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1005.458378] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1005.458633] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1005.458891] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1005.459114] env[66583]: INFO nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Took 0.64 seconds to destroy the instance on the hypervisor. 
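Several operations above block on vCenter tasks: wait_for_task() logs the returned Task handle, _poll_task reports "progress is N%" until the task finishes, and the completed entry carries a duration_secs. A rough sketch of that poll loop, self-contained and not oslo.vmware's actual code:

    import time

    def wait_for_task(poll_task_info, interval=0.5, timeout=60.0):
        """poll_task_info() returns e.g. {'state': 'running', 'progress': 10}."""
        started = time.monotonic()
        while time.monotonic() - started < timeout:
            info = poll_task_info()
            if info['state'] == 'success':
                # Completed entries above report how long the task ran.
                info['duration_secs'] = time.monotonic() - started
                return info
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            # Matches entries like "Task: {'id': task-3470360, ...} progress is 0%."
            print(f"progress is {info.get('progress', 0)}%")
            time.sleep(interval)
        raise TimeoutError('timed out waiting for task to complete')

    # Example: wait_for_task(lambda: {'state': 'success'}) returns immediately.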
[ 1005.461512] env[66583]: DEBUG nova.compute.claims [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1005.461620] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1005.461833] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1005.468821] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Created folder: Project (60cc48aabaaf4189a35327c52cfdfce0) in parent group-v693485. [ 1005.469012] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Creating folder: Instances. Parent ref: group-v693557. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1005.469253] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85e2f928-9626-44a1-bcd5-8dfef07b1ed8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.477862] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Created folder: Instances in parent group-v693557. [ 1005.478109] env[66583]: DEBUG oslo.service.loopingcall [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1005.478293] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1005.478495] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-07c61b7a-338b-41fd-944f-aa535a3f5999 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.495446] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1005.495694] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1005.495848] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1005.496097] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1005.496243] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1005.496396] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1005.496609] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum 
VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1005.496769] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1005.499995] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1005.499995] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1005.499995] env[66583]: DEBUG nova.virt.hardware [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1005.499995] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.036s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1005.499995] env[66583]: DEBUG nova.compute.utils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1005.500298] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1332d1ea-c54b-43df-8542-ac4dadf2d73d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.502873] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1005.503055] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1005.503260] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1005.503513] env[66583]: DEBUG nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1005.503692] env[66583]: DEBUG nova.network.neutron [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1005.511779] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39341728-dba2-465f-b9d9-bc3e1d63cc59 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1005.515762] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1005.515762] env[66583]: value = "task-3470363" [ 1005.515762] env[66583]: _type = "Task" [ 1005.515762] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1005.531951] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470363, 'name': CreateVM_Task} progress is 10%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1005.545316] env[66583]: DEBUG nova.policy [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '747f0b91c4fe46c49fe1a390053463e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d88e984e6674da9a33c06705de9d7e2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1005.621142] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Successfully created port: 2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1005.734331] env[66583]: DEBUG neutronclient.v2_0.client [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1005.736107] env[66583]: ERROR nova.compute.manager [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] result = getattr(controller, method)(*args, **kwargs) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._get(image_id) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] resp, body = self.http_client.get(url, headers=header) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.request(url, 'GET', **kwargs) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._handle_response(resp) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exc.from_response(resp, resp.content) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] During handling of the above exception, another exception occurred: [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.driver.spawn(context, instance, image_meta, [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._fetch_image_if_missing(context, vi) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image_fetch(context, vi, tmp_image_ds_loc) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] images.fetch_image( [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] metadata = IMAGE_API.get(context, image_ref) [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return session.show(context, image_id, [ 1005.736107] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] _reraise_translated_image_exception(image_id) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise new_exc.with_traceback(exc_trace) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 
83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] result = getattr(controller, method)(*args, **kwargs) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._get(image_id) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] resp, body = self.http_client.get(url, headers=header) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.request(url, 'GET', **kwargs) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._handle_response(resp) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exc.from_response(resp, resp.content) [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] During handling of the above exception, another exception occurred: [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._build_and_run_instance(context, instance, image, [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] with excutils.save_and_reraise_exception(): [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.force_reraise() [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise self.value [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] with self.rt.instance_claim(context, instance, node, allocs, [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.abort() [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1005.737147] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return f(*args, **kwargs) [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._unset_instance_host_and_node(instance) [ 1005.738294] env[66583]: ERROR nova.compute.manager 
[instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] instance.save() [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] updates, result = self.indirection_api.object_action( [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return cctxt.call(context, 'object_action', objinst=objinst, [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] result = self.transport._send( [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._driver.send(target, ctxt, message, [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise result [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] nova.exception_Remote.InstanceNotFound_Remote: Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a could not be found. 
[ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return getattr(target, method)(*args, **kwargs) [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return fn(self, *args, **kwargs) [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] old_ref, inst_ref = db.instance_update_and_get_original( [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return f(*args, **kwargs) [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] with excutils.save_and_reraise_exception() as ectxt: [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.force_reraise() [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise self.value [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return f(*args, 
**kwargs) [ 1005.738294] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return f(context, *args, **kwargs) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exception.InstanceNotFound(instance_id=uuid) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] nova.exception.InstanceNotFound: Instance 83ac0082-b7fe-408d-9d5a-6e614ae7e61a could not be found. [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] During handling of the above exception, another exception occurred: [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] exception_handler_v20(status_code, error_body) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise client_exc(message=error_message, [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1005.739352] 
env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Neutron server returns request_ids: ['req-d6841789-b2b3-4925-a582-58ff233e8545'] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] During handling of the above exception, another exception occurred: [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Traceback (most recent call last): [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._deallocate_network(context, instance, requested_networks) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self.network_api.deallocate_for_instance( [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] data = neutron.list_ports(**search_opts) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.list('ports', self.ports_path, retrieve_all, [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1005.739352] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] for r in self._pagination(collection, path, **params): [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] res = self.get(path, params=params) [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.retry_request("GET", action, body=body, [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] return self.do_request(method, action, body=body, [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] ret = obj(*args, **kwargs) [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] self._handle_fault_response(status_code, replybody, resp) [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] raise exception.Unauthorized() [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] nova.exception.Unauthorized: Not authorized. [ 1005.745457] env[66583]: ERROR nova.compute.manager [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] [ 1005.769344] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8c21c279-8ee6-48ca-8209-427898aa1432 tempest-ImagesOneServerTestJSON-483119449 tempest-ImagesOneServerTestJSON-483119449-project-member] Lock "83ac0082-b7fe-408d-9d5a-6e614ae7e61a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 306.442s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1005.785292] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1005.794071] env[66583]: DEBUG oslo_vmware.api [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Task: {'id': task-3470360, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066319} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1005.794071] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1005.794071] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Deleted contents of the VM from datastore datastore2 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1005.794071] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1005.794781] env[66583]: INFO nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Took 0.61 seconds to destroy the instance on the hypervisor. [ 1005.796068] env[66583]: DEBUG nova.compute.claims [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1005.796242] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1005.799696] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1005.832214] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.036s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1005.832879] env[66583]: DEBUG nova.compute.utils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance a14582eb-f78f-44d6-8c82-16976c0cec5b could not be found. 
{{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1005.844528] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1005.844715] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1005.844883] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1005.845065] env[66583]: DEBUG nova.compute.manager [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1005.845262] env[66583]: DEBUG nova.network.neutron [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1005.852611] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1005.852875] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1005.854400] env[66583]: INFO nova.compute.claims [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1006.035841] env[66583]: DEBUG neutronclient.v2_0.client [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1006.038997] env[66583]: ERROR nova.compute.manager [None 
req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] result = getattr(controller, method)(*args, **kwargs) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._get(image_id) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] resp, body = self.http_client.get(url, headers=header) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self.request(url, 'GET', **kwargs) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._handle_response(resp) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise exc.from_response(resp, resp.content) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] During handling of the above exception, another exception occurred: [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.driver.spawn(context, instance, image_meta, [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._fetch_image_if_missing(context, vi) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image_fetch(context, vi, tmp_image_ds_loc) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] images.fetch_image( [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] metadata = IMAGE_API.get(context, image_ref) [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return session.show(context, image_id, [ 1006.038997] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] _reraise_translated_image_exception(image_id) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise new_exc.with_traceback(exc_trace) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: 
a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] result = getattr(controller, method)(*args, **kwargs) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._get(image_id) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] resp, body = self.http_client.get(url, headers=header) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self.request(url, 'GET', **kwargs) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._handle_response(resp) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise exc.from_response(resp, resp.content) [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] During handling of the above exception, another exception occurred: [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._build_and_run_instance(context, instance, image, [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] with excutils.save_and_reraise_exception(): [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.force_reraise() [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise self.value [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] with self.rt.instance_claim(context, instance, node, allocs, [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.abort() [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1006.039990] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return f(*args, **kwargs) [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self._unset_instance_host_and_node(instance) [ 1006.040879] env[66583]: ERROR nova.compute.manager 
[instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] instance.save() [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] updates, result = self.indirection_api.object_action( [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return cctxt.call(context, 'object_action', objinst=objinst, [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] result = self.transport._send( [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._driver.send(target, ctxt, message, [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise result [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] nova.exception_Remote.InstanceNotFound_Remote: Instance a14582eb-f78f-44d6-8c82-16976c0cec5b could not be found. 
[ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last): [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return getattr(target, method)(*args, **kwargs) [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return fn(self, *args, **kwargs) [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] old_ref, inst_ref = db.instance_update_and_get_original( [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return f(*args, **kwargs) [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] with excutils.save_and_reraise_exception() as ectxt: [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] self.force_reraise() [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] raise self.value [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] return f(*args, 
**kwargs)
[ 1006.040879] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     return f(context, *args, **kwargs)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] nova.exception.InstanceNotFound: Instance a14582eb-f78f-44d6-8c82-16976c0cec5b could not be found.
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] During handling of the above exception, another exception occurred:
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last):
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     exception_handler_v20(status_code, error_body)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     raise client_exc(message=error_message,
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Neutron server returns request_ids: ['req-6065e8e8-5738-4bde-852d-37981aba65be']
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] During handling of the above exception, another exception occurred:
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Traceback (most recent call last):
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     self._deallocate_network(context, instance, requested_networks)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     self.network_api.deallocate_for_instance(
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     data = neutron.list_ports(**search_opts)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     return self.list('ports', self.ports_path, retrieve_all,
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1006.041846] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     for r in self._pagination(collection, path, **params):
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     res = self.get(path, params=params)
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     return self.retry_request("GET", action, body=body,
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     return self.do_request(method, action, body=body,
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     ret = obj(*args, **kwargs)
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     self._handle_fault_response(status_code, replybody, resp)
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]     raise exception.Unauthorized()
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] nova.exception.Unauthorized: Not authorized.
[ 1006.046375] env[66583]: ERROR nova.compute.manager [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b]
[ 1006.048208] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470363, 'name': CreateVM_Task, 'duration_secs': 0.281506} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1006.048456] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1006.049281] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1006.049281] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1006.049591] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1006.049801] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c0374db8-0516-469c-95e3-d336e465e7d2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.054898] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Waiting for the task: (returnval){
[ 1006.054898] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]520f97ef-ea8f-6661-f63d-e82eac677bb4"
[ 1006.054898] env[66583]: _type = "Task"
[ 1006.054898] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1006.070771] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b6b00db7-c85f-4645-af84-3525f2a3c8c0 tempest-ServersTestJSON-2037535159 tempest-ServersTestJSON-2037535159-project-member] Lock "a14582eb-f78f-44d6-8c82-16976c0cec5b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 304.488s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1006.071040] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1006.071272] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1006.071473] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1006.080329] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65cd7859-c492-4824-b267-9de135897704 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.088734] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b9354ec-9392-4efe-8120-f81a2511cb59 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.092071] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Starting instance... {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1006.127567] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Successfully updated port: 6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1006.129195] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60c6a071-6b0d-445d-bb42-1b6ebbf9212d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.142692] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1006.142846] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquired lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1006.142998] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1006.145072] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8ad728c-3508-4286-be31-26056f4fff0a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.162513] env[66583]: DEBUG nova.compute.provider_tree [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1006.166444] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1006.173914] env[66583]: DEBUG nova.scheduler.client.report [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1006.189491] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.336s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1006.189987] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1006.192261] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.026s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1006.194518] env[66583]: INFO nova.compute.claims [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1006.237430] env[66583]: DEBUG nova.compute.manager [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Received event network-vif-plugged-2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1006.237430] env[66583]: DEBUG oslo_concurrency.lockutils [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] Acquiring lock "89ccce06-2094-4f87-a77b-cad92d351dfa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1006.237430] env[66583]: DEBUG oslo_concurrency.lockutils [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] Lock "89ccce06-2094-4f87-a77b-cad92d351dfa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1006.237629] env[66583]: DEBUG oslo_concurrency.lockutils [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] Lock "89ccce06-2094-4f87-a77b-cad92d351dfa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1006.237777] env[66583]: DEBUG nova.compute.manager [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] No waiting events found dispatching network-vif-plugged-2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1006.237944] env[66583]: WARNING nova.compute.manager [req-cbf0c3f9-cb32-4e1e-8a64-9e09f6bc2fb2 req-73a3a8a4-0d49-4e65-b108-a0b0276ffdaf service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Received unexpected event network-vif-plugged-2e991e84-4f4a-4046-903b-070d9f4fc9bd for instance with vm_state building and task_state spawning.
[ 1006.239742] env[66583]: DEBUG nova.compute.utils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1006.241399] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1006.241573] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1006.250986] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1006.277764] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1006.300302] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Successfully updated port: 2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1006.308857] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1006.308957] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1006.309121] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1006.314898] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1006.349033] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1006.352904] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1006.353104] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1006.353247] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1006.353432] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1006.353577] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1006.353759] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1006.353983] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1006.354156] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1006.354322] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1006.354484] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1006.354649] env[66583]: DEBUG nova.virt.hardware [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1006.355944] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b649e31-c220-4189-b5e9-1fd3bbc38116 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.368381] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32d764e4-202e-4ddd-921f-3083ef8d86db {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.424101] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0315aabf-8e47-4a3e-8b58-35c2496597dc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.431258] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6b89d98-f0c9-41ed-86b5-730d78927a5e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.466986] env[66583]: DEBUG nova.policy [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '747f0b91c4fe46c49fe1a390053463e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3d88e984e6674da9a33c06705de9d7e2', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1006.468697] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01d5601f-50ff-4821-8a51-ae72faaae8e9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.476587] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d295cf17-1e82-47c0-a73e-dc7589afa018 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.489990] env[66583]: DEBUG nova.compute.provider_tree [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1006.500815] env[66583]: DEBUG nova.scheduler.client.report [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1006.516023] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Updating instance_info_cache with network_info: [{"id": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "address": "fa:16:3e:f0:c6:d2", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e991e84-4f", "ovs_interfaceid": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1006.517957] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1006.518424] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1006.522737] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Received event network-changed-06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1006.522925] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Refreshing instance network info cache due to event network-changed-06baa48e-0c25-4b8b-9381-d08a4a23a21b. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1006.523177] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Acquiring lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1006.523356] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Acquired lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1006.523905] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Refreshing network info cache for port 06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1006.525748] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1006.526028] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance network_info: |[{"id": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "address": "fa:16:3e:f0:c6:d2", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e991e84-4f", "ovs_interfaceid": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1006.526351] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f0:c6:d2', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac2c9d07-ed01-47a9-88f1-562992bc1076', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2e991e84-4f4a-4046-903b-070d9f4fc9bd', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1006.533981] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating folder: Project (3d88e984e6674da9a33c06705de9d7e2). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1006.534483] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-41524c5f-d7b0-4f84-ae0c-50b21f63ff8c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.548531] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created folder: Project (3d88e984e6674da9a33c06705de9d7e2) in parent group-v693485.
[ 1006.548735] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating folder: Instances. Parent ref: group-v693560. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1006.548962] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d574ff7d-74e5-49f7-87be-37371c0564a0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.554078] env[66583]: DEBUG nova.compute.utils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1006.555302] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1006.555483] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1006.561282] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created folder: Instances in parent group-v693560.
[ 1006.561538] env[66583]: DEBUG oslo.service.loopingcall [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1006.561884] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1006.561952] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e2261962-0b0a-4d97-8ecb-0bf9117ec42a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.578644] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1006.587991] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1006.587991] env[66583]: value = "task-3470366"
[ 1006.587991] env[66583]: _type = "Task"
[ 1006.587991] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1006.595958] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470366, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1006.663712] env[66583]: DEBUG nova.policy [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3905566ae8314d40a601efb54a37ad26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '820b7d141569446ead1901b8442f8184', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1006.667838] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Start spawning the instance on the hypervisor. {{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=<?>,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-17T04:21:32Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1006.698111] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1006.698686] env[66583]: DEBUG nova.virt.hardware [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1006.699636] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-031ca165-462f-4266-8e91-cd70301c1125 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.710397] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-947853af-2c5e-4eab-8696-79beacc95dd1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1006.730513] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Successfully created port: 4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1006.978944] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Updated VIF entry in instance network info cache for port 06baa48e-0c25-4b8b-9381-d08a4a23a21b. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1006.979384] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Updating instance_info_cache with network_info: [{"id": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "address": "fa:16:3e:a5:b3:9b", "network": {"id": "ee5be6c6-6873-4f32-ae75-f00e9b5a25eb", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-1482290419-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "60cc48aabaaf4189a35327c52cfdfce0", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "24210a23-d8ac-4f4f-84ac-dc0636de9a72", "external-id": "nsx-vlan-transportzone-257", "segmentation_id": 257, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap06baa48e-0c", "ovs_interfaceid": "06baa48e-0c25-4b8b-9381-d08a4a23a21b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1006.988809] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Releasing lock "refresh_cache-68449c86-cda6-46ff-a349-c2072829257e" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1006.989355] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Received event network-vif-plugged-6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1006.989608] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Acquiring lock "035e8729-c02f-490e-a0e4-b8877b52e75b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1006.989849] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Lock "035e8729-c02f-490e-a0e4-b8877b52e75b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1006.990067] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Lock "035e8729-c02f-490e-a0e4-b8877b52e75b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1006.990277] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] No waiting events found dispatching network-vif-plugged-6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1006.990913] env[66583]: WARNING nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Received unexpected event network-vif-plugged-6709862d-6896-4eaf-a065-2e2174f26dbd for instance with vm_state building and task_state spawning.
[ 1006.991169] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Received event network-changed-6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
[ 1006.991380] env[66583]: DEBUG nova.compute.manager [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Refreshing instance network info cache due to event network-changed-6709862d-6896-4eaf-a065-2e2174f26dbd. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}}
[ 1006.991587] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Acquiring lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1007.086714] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Successfully created port: 50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1007.099337] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470366, 'name': CreateVM_Task, 'duration_secs': 0.298827} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1007.099337] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1007.099570] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1007.099767] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1007.100110] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1007.100361] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3eef897-2a6d-41df-947a-7e6722b2bc7d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1007.105579] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){
[ 1007.105579] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52817032-a326-fa49-f298-66d9a02cde25"
[ 1007.105579] env[66583]: _type = "Task"
[ 1007.105579] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1007.113979] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52817032-a326-fa49-f298-66d9a02cde25, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1007.164944] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Updating instance_info_cache with network_info: [{"id": "6709862d-6896-4eaf-a065-2e2174f26dbd", "address": "fa:16:3e:19:74:55", "network": {"id": "9bccade8-c7cf-4caf-9702-d00af21409ce", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-305316708-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e450d78a63a64fcda4b141e03517015c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aaf99dce-c773-48db-a2d9-00b8d0a7c75d", "external-id": "nsx-vlan-transportzone-248", "segmentation_id": 248, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6709862d-68", "ovs_interfaceid": "6709862d-6896-4eaf-a065-2e2174f26dbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1007.183230] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Releasing lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1007.183230] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Acquired lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1007.183488] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Refreshing network info cache for port 6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1007.184610] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:74:55', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'aaf99dce-c773-48db-a2d9-00b8d0a7c75d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6709862d-6896-4eaf-a065-2e2174f26dbd', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1007.195606] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Creating folder: Project (e450d78a63a64fcda4b141e03517015c). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1007.195606] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1f4dcab3-4cd5-4767-9e76-f26259e6ae80 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.207781] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Created folder: Project (e450d78a63a64fcda4b141e03517015c) in parent group-v693485. [ 1007.207978] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Creating folder: Instances. Parent ref: group-v693563. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1007.209178] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a7924e98-9f96-4f40-94d4-a56ff70fb44c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.217847] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Created folder: Instances in parent group-v693563. [ 1007.218119] env[66583]: DEBUG oslo.service.loopingcall [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1007.218308] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1007.218507] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2e541ab4-38a9-4a58-97a3-20d4a7ec098e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.240158] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1007.240158] env[66583]: value = "task-3470369" [ 1007.240158] env[66583]: _type = "Task" [ 1007.240158] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1007.242749] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Successfully updated port: 1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1007.250969] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470369, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1007.254764] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.254904] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquired lock "refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1007.255138] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1007.435660] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1007.619150] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1007.619404] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1007.619622] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.637919] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Successfully updated port: 50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1007.648083] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.648232] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1007.648737] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1007.683394] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1007.751205] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470369, 'name': CreateVM_Task, 'duration_secs': 0.284593} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1007.754312] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1007.755340] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1007.755573] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1007.755906] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1007.756476] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-329a075a-da25-41f2-8a36-a642cad39023 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.761365] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Waiting for the task: (returnval){ [ 1007.761365] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]526fa235-8790-5288-f4a6-8a5808d1282c" [ 1007.761365] env[66583]: _type = "Task" [ 1007.761365] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1007.769480] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]526fa235-8790-5288-f4a6-8a5808d1282c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1007.842665] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Updating instance_info_cache with network_info: [{"id": "50e69df8-06bf-4d72-8d04-e3579b49628e", "address": "fa:16:3e:7b:c5:a3", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50e69df8-06", "ovs_interfaceid": "50e69df8-06bf-4d72-8d04-e3579b49628e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1007.863248] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1007.863248] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance network_info: |[{"id": "50e69df8-06bf-4d72-8d04-e3579b49628e", "address": "fa:16:3e:7b:c5:a3", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50e69df8-06", "ovs_interfaceid": "50e69df8-06bf-4d72-8d04-e3579b49628e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1007.863248] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7b:c5:a3', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ae4e3171-21cd-4094-b6cf-81bf366c75bd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '50e69df8-06bf-4d72-8d04-e3579b49628e', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1007.871791] env[66583]: DEBUG oslo.service.loopingcall [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1007.872646] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1007.872646] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-22264145-a5d4-4e11-8284-d944ae15b265 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1007.896043] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1007.896043] env[66583]: value = "task-3470370" [ 1007.896043] env[66583]: _type = "Task" [ 1007.896043] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1007.901400] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470370, 'name': CreateVM_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1008.051900] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Successfully created port: 5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1008.271675] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1008.271955] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1008.272157] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.378242] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Updating instance_info_cache with network_info: [{"id": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "address": "fa:16:3e:96:6c:1d", "network": {"id": "e53b4d47-5ad5-4a8f-b637-bb549e9ea5d7", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-460278418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fb348123191d492f8cf6a6bd7f8ca357", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b7a73c01-1bb9-4612-a1a7-16d71b732e81", "external-id": "nsx-vlan-transportzone-711", "segmentation_id": 711, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1598c54f-8b", "ovs_interfaceid": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.390674] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Releasing lock 
"refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1008.390897] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance network_info: |[{"id": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "address": "fa:16:3e:96:6c:1d", "network": {"id": "e53b4d47-5ad5-4a8f-b637-bb549e9ea5d7", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-460278418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fb348123191d492f8cf6a6bd7f8ca357", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b7a73c01-1bb9-4612-a1a7-16d71b732e81", "external-id": "nsx-vlan-transportzone-711", "segmentation_id": 711, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1598c54f-8b", "ovs_interfaceid": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1008.391480] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:96:6c:1d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'b7a73c01-1bb9-4612-a1a7-16d71b732e81', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1598c54f-8bc9-4eca-94b4-f0bceefb9fd3', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1008.401135] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Creating folder: Project (fb348123191d492f8cf6a6bd7f8ca357). Parent ref: group-v693485. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1008.401817] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-32aff2d5-8204-4e5f-900b-ce75ccb764a4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.433172] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470370, 'name': CreateVM_Task, 'duration_secs': 0.303491} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1008.433172] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1008.433172] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.433172] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1008.433172] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1008.433886] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e84e7e46-30c7-4583-8fbf-5b29b2c95ce5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.439905] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 1008.439905] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]5251b525-de26-77c0-d5f0-91bf2c99858e" [ 1008.439905] env[66583]: _type = "Task" [ 1008.439905] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1008.445896] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Created folder: Project (fb348123191d492f8cf6a6bd7f8ca357) in parent group-v693485. [ 1008.446268] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Creating folder: Instances. Parent ref: group-v693567. {{(pid=66583) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1008.446958] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-51f14cd4-3d5c-426b-b2ef-d0012029b752 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.452341] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]5251b525-de26-77c0-d5f0-91bf2c99858e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1008.460408] env[66583]: INFO nova.virt.vmwareapi.vm_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Created folder: Instances in parent group-v693567. [ 1008.460762] env[66583]: DEBUG oslo.service.loopingcall [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1008.461093] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1008.461389] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-503fa4ef-2748-4ef7-b3e2-3bd9ac8b6eca {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.484017] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1008.484017] env[66583]: value = "task-3470373" [ 1008.484017] env[66583]: _type = "Task" [ 1008.484017] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1008.490441] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470373, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1008.506251] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Updated VIF entry in instance network info cache for port 6709862d-6896-4eaf-a065-2e2174f26dbd. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1008.506251] env[66583]: DEBUG nova.network.neutron [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Updating instance_info_cache with network_info: [{"id": "6709862d-6896-4eaf-a065-2e2174f26dbd", "address": "fa:16:3e:19:74:55", "network": {"id": "9bccade8-c7cf-4caf-9702-d00af21409ce", "bridge": "br-int", "label": "tempest-ServerPasswordTestJSON-305316708-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e450d78a63a64fcda4b141e03517015c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "aaf99dce-c773-48db-a2d9-00b8d0a7c75d", "external-id": "nsx-vlan-transportzone-248", "segmentation_id": 248, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6709862d-68", "ovs_interfaceid": "6709862d-6896-4eaf-a065-2e2174f26dbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1008.514781] env[66583]: DEBUG oslo_concurrency.lockutils [req-72f3cf72-2f6c-4c03-bac1-11610e54fad6 req-4dd75a92-a84a-4ca9-9439-c29d4a8c6834 service nova] Releasing lock "refresh_cache-035e8729-c02f-490e-a0e4-b8877b52e75b" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1008.634206] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Received event network-changed-2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1008.634556] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Refreshing instance network info cache due to event network-changed-2e991e84-4f4a-4046-903b-070d9f4fc9bd. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1008.634875] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Acquiring lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.635140] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Acquired lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1008.635451] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Refreshing network info cache for port 2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1008.922767] env[66583]: DEBUG nova.compute.manager [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Received event network-vif-plugged-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1008.922995] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Acquiring lock "e9136963-e0fc-4344-880b-a21549f2cf23-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1008.923297] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Lock "e9136963-e0fc-4344-880b-a21549f2cf23-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1008.923481] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Lock "e9136963-e0fc-4344-880b-a21549f2cf23-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1008.923649] env[66583]: DEBUG nova.compute.manager [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] No waiting events found dispatching network-vif-plugged-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1008.923813] env[66583]: WARNING nova.compute.manager [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Received unexpected event network-vif-plugged-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 for instance with vm_state building and task_state spawning.
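The `InstanceEvents` records above show Nova's external-event dispatch pattern: Neutron reports `network-vif-plugged`, the compute manager takes the per-instance `<uuid>-events` lock to pop any registered waiter, and because the instance is still building with no waiter registered yet, the event is logged as unexpected and dropped. A minimal sketch of that dispatch pattern follows; the class and method names mirror the log, but the implementation (plain `threading` primitives, simplified data structures) is an assumption for illustration, not Nova's actual code.

```python
import threading


class InstanceEvents:
    # Registry of per-instance waiters; a simplified stand-in for the
    # nova.compute.manager.InstanceEvents seen in the lock records above.
    def __init__(self):
        self._events = {}   # {instance_uuid: {event_name: threading.Event}}
        self._lock = threading.Lock()

    def prepare_for_instance_event(self, instance_uuid, name):
        # A thread that expects the event (e.g. the one spawning the VM)
        # registers a waiter before blocking on it.
        with self._lock:
            instance_map = self._events.setdefault(instance_uuid, {})
            return instance_map.setdefault(name, threading.Event())

    def pop_instance_event(self, instance_uuid, name):
        # Popping under the lock corresponds to the "<uuid>-events"
        # acquire/release pairs in the log.
        with self._lock:
            return self._events.get(instance_uuid, {}).pop(name, None)


def external_instance_event(events, instance_uuid, name):
    waiter = events.pop_instance_event(instance_uuid, name)
    if waiter is None:
        # No waiter registered: the "Received unexpected event ..." WARNING
        # path seen above for an instance still building/spawning.
        print(f"Received unexpected event {name} for instance {instance_uuid}")
    else:
        waiter.set()  # wake the thread blocked on the event


uuid = "e9136963-e0fc-4344-880b-a21549f2cf23"
events = InstanceEvents()
external_instance_event(events, uuid, "network-vif-plugged")  # unexpected
waiter = events.prepare_for_instance_event(uuid, "network-vif-plugged")
external_instance_event(events, uuid, "network-vif-plugged")  # dispatched
assert waiter.wait(timeout=1.0)
```

Run directly, the first dispatch exercises the unexpected-event path and the second wakes the registered waiter, matching the "No waiting events found dispatching ..." record above.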
[ 1008.923974] env[66583]: DEBUG nova.compute.manager [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Received event network-changed-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1008.924142] env[66583]: DEBUG nova.compute.manager [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Refreshing instance network info cache due to event network-changed-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1008.924341] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Acquiring lock "refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.924484] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Acquired lock "refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1008.924640] env[66583]: DEBUG nova.network.neutron [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Refreshing network info cache for port 1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1008.950418] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1008.950688] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1008.950903] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.991646] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470373, 'name': CreateVM_Task, 'duration_secs': 0.285805} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1008.991749] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1008.992525] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1008.992689] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1008.992995] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1008.993484] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b675d64-506c-4b65-9a47-841d1b8b11de {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1009.004096] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Waiting for the task: (returnval){ [ 1009.004096] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52a54450-2c82-1230-ce8e-613570a0e570" [ 1009.004096] env[66583]: _type = "Task" [ 1009.004096] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1009.012319] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52a54450-2c82-1230-ce8e-613570a0e570, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1009.516267] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1009.516602] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1009.516689] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.713596] env[66583]: WARNING oslo_vmware.rw_handles [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1009.713596] env[66583]: ERROR oslo_vmware.rw_handles [ 1009.714303] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1009.715723] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 
tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1009.715956] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Copying Virtual Disk [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/a0fee2ad-de6c-4682-bba3-3a687d43a7f3/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1009.716251] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6948d072-3c98-4e14-bd89-b24fbc6bbfd9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1009.723992] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Waiting for the task: (returnval){ [ 1009.723992] env[66583]: value = "task-3470374" [ 1009.723992] env[66583]: _type = "Task" [ 1009.723992] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1009.731913] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Task: {'id': task-3470374, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1009.834040] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Updated VIF entry in instance network info cache for port 2e991e84-4f4a-4046-903b-070d9f4fc9bd. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1009.834443] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Updating instance_info_cache with network_info: [{"id": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "address": "fa:16:3e:f0:c6:d2", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2e991e84-4f", "ovs_interfaceid": "2e991e84-4f4a-4046-903b-070d9f4fc9bd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1009.837553] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Successfully updated port: 4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1009.845563] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.845699] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1009.845850] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1009.847292] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Releasing lock "refresh_cache-89ccce06-2094-4f87-a77b-cad92d351dfa" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1009.847518] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c 
req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Received event network-vif-plugged-50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1009.847743] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Acquiring lock "08689558-cc57-43c5-b56e-f9785b515717-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1009.847936] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Lock "08689558-cc57-43c5-b56e-f9785b515717-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1009.852180] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Lock "08689558-cc57-43c5-b56e-f9785b515717-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1009.852404] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] No waiting events found dispatching network-vif-plugged-50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1009.852592] env[66583]: WARNING nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Received unexpected event network-vif-plugged-50e69df8-06bf-4d72-8d04-e3579b49628e for instance with vm_state building and task_state spawning. [ 1009.852759] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Received event network-changed-50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1009.852916] env[66583]: DEBUG nova.compute.manager [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Refreshing instance network info cache due to event network-changed-50e69df8-06bf-4d72-8d04-e3579b49628e.
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1009.853138] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Acquiring lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1009.853345] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Acquired lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1009.853521] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Refreshing network info cache for port 50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1009.975930] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1010.167878] env[66583]: DEBUG nova.network.neutron [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Updated VIF entry in instance network info cache for port 1598c54f-8bc9-4eca-94b4-f0bceefb9fd3. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1010.168306] env[66583]: DEBUG nova.network.neutron [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Updating instance_info_cache with network_info: [{"id": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "address": "fa:16:3e:96:6c:1d", "network": {"id": "e53b4d47-5ad5-4a8f-b637-bb549e9ea5d7", "bridge": "br-int", "label": "tempest-ServersTestManualDisk-460278418-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "fb348123191d492f8cf6a6bd7f8ca357", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "b7a73c01-1bb9-4612-a1a7-16d71b732e81", "external-id": "nsx-vlan-transportzone-711", "segmentation_id": 711, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1598c54f-8b", "ovs_interfaceid": "1598c54f-8bc9-4eca-94b4-f0bceefb9fd3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1010.179646] env[66583]: DEBUG oslo_concurrency.lockutils [req-3203e63d-13bd-4341-9e48-289aa6256757 req-35314191-f1db-46f9-bc00-e734e704d08a service nova] Releasing lock 
"refresh_cache-e9136963-e0fc-4344-880b-a21549f2cf23" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1010.235725] env[66583]: DEBUG oslo_vmware.exceptions [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1010.238225] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1010.238662] env[66583]: ERROR nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1010.238662] env[66583]: Faults: ['InvalidArgument'] [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] Traceback (most recent call last): [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] yield resources [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self.driver.spawn(context, instance, image_meta, [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self._fetch_image_if_missing(context, vi) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] image_cache(vi, tmp_image_ds_loc) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] vm_util.copy_virtual_disk( [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] session._wait_for_task(vmdk_copy_task) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return self.wait_for_task(task_ref) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return evt.wait() [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] result = hub.switch() [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return self.greenlet.switch() [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self.f(*self.args, **self.kw) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] raise exceptions.translate_fault(task_info.error) [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] Faults: ['InvalidArgument'] [ 1010.238662] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] [ 1010.239652] env[66583]: INFO nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Terminating instance [ 1010.240331] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1010.240531] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating directory with path 
[datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1010.241359] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1010.241621] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1010.241859] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4d62d8b3-e078-466c-b702-ec509fb52c35 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.244575] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2a404da-1ac6-41ce-a6e6-869463035205 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.251575] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1010.251841] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0e8a963f-c56f-4c44-bf6c-77ba028347c0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.254703] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1010.254876] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1010.255933] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-50b0f359-5b3a-4087-a2a3-3481cb024e08 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.260811] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 1010.260811] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52db0401-d8e5-f29e-9e65-194a9f8d762d" [ 1010.260811] env[66583]: _type = "Task" [ 1010.260811] env[66583]: } to complete. 
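The `Waiting for the task: (returnval){ ... _type = "Task" }` blocks and the `progress is N%.` lines that follow are oslo.vmware's wait_for_task polling a vSphere task until it reaches a terminal state. A rough, stdlib-only sketch of that loop (the real one in oslo_vmware/api.py runs the poll inside an eventlet loopingcall and raises a translated fault on error; the dict shape below is an assumption loosely standing in for vSphere TaskInfo):

import time

def wait_for_task(poll_fn, interval=0.5, timeout=60):
    """Poll poll_fn() until the task succeeds or errors out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll_fn()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # oslo.vmware raises a fault translated from task_info.error here
            raise RuntimeError(info['error'])
        time.sleep(interval)   # between polls the log prints "progress is N%."
    raise TimeoutError('task did not complete in %ss' % timeout)

polls = iter([{'state': 'running', 'progress': 0},
              {'state': 'success', 'progress': 100}])
print(wait_for_task(lambda: next(polls)))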
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1010.268591] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52db0401-d8e5-f29e-9e65-194a9f8d762d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1010.320551] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1010.320780] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1010.320956] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Deleting the datastore file [datastore1] e7664037-62b0-4195-b935-eab75d232f5d {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1010.325364] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9da6a1b8-4466-4479-871c-338388c8063b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.331194] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Waiting for the task: (returnval){ [ 1010.331194] env[66583]: value = "task-3470376" [ 1010.331194] env[66583]: _type = "Task" [ 1010.331194] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1010.339720] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Task: {'id': task-3470376, 'name': DeleteDatastoreFile_Task} progress is 0%. 
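Paths like `[datastore1] devstack-image-cache_base` in the mkdir and file_delete calls above are vSphere datastore paths: a bracketed datastore name followed by a datastore-relative path. A tiny illustrative helper (a simplified stand-in for the datastore path object oslo.vmware provides):

def ds_path(datastore, *parts):
    """Format a '[datastore] relative/path' string (illustrative only)."""
    rel = '/'.join(p.strip('/') for p in parts if p)
    return '[%s] %s' % (datastore, rel) if rel else '[%s]' % datastore

print(ds_path('datastore1', 'devstack-image-cache_base'))
print(ds_path('datastore1', 'vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64',
              '2a5e619b-8532-4b3c-9d86-85994a7987af', 'tmp-sparse.vmdk'))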
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1010.771549] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1010.771918] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating directory with path [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1010.772043] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6acf5f57-a73c-4d08-85d0-78507d016a20 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.783160] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created directory with path [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1010.783385] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Fetch image to [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1010.783565] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1010.784303] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-59b04689-0fae-4794-b975-bc990ff0d824 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.790872] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5456a355-0f31-4dbd-8021-8acbb23e314c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.800208] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68b5979f-1544-4071-8348-578239cdb01d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1010.804604] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 
e1873f82-8e24-460a-b5cb-36e3bf06abcb] Updating instance_info_cache with network_info: [{"id": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "address": "fa:16:3e:c8:d0:42", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4d19ac4f-d8", "ovs_interfaceid": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1011.499768] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Updated VIF entry in instance network info cache for port 50e69df8-06bf-4d72-8d04-e3579b49628e. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1011.500100] env[66583]: DEBUG nova.network.neutron [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Updating instance_info_cache with network_info: [{"id": "50e69df8-06bf-4d72-8d04-e3579b49628e", "address": "fa:16:3e:7b:c5:a3", "network": {"id": "e92133d2-2caf-4276-90e3-13ef3ceda14b", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-735612131-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "820b7d141569446ead1901b8442f8184", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ae4e3171-21cd-4094-b6cf-81bf366c75bd", "external-id": "nsx-vlan-transportzone-193", "segmentation_id": 193, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap50e69df8-06", "ovs_interfaceid": "50e69df8-06bf-4d72-8d04-e3579b49628e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1011.504407] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Successfully updated 
port: 5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1011.507768] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Received event network-vif-plugged-4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1011.507768] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquiring lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1011.507768] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1011.507768] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1011.507768] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] No waiting events found dispatching network-vif-plugged-4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1011.507768] env[66583]: WARNING nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Received unexpected event network-vif-plugged-4d19ac4f-d881-4626-9fa1-b0909f18a52d for instance with vm_state building and task_state spawning. [ 1011.507768] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Received event network-changed-4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1011.507768] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Refreshing instance network info cache due to event network-changed-4d19ac4f-d881-4626-9fa1-b0909f18a52d.
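As the event sequence above shows, external_instance_event routes each Neutron notification by name: network-vif-plugged events try to wake a waiter registered by the spawning thread, while network-changed events trigger a network info cache refresh. A condensed sketch of that dispatch (the handler signature and the waiters mapping are assumptions, not Nova's actual interfaces):

def handle_external_event(instance, event_name, waiters, refresh_cache):
    """Route one Neutron notification by name (condensed)."""
    if event_name.startswith('network-vif-plugged-'):
        waiter = waiters.pop((instance, event_name), None)
        if waiter is None:
            # -> WARNING "Received unexpected event ... vm_state building"
            print('unexpected event %s for %s' % (event_name, instance))
        else:
            waiter.set()   # wakes the thread blocked waiting for the plug
    elif event_name.startswith('network-changed-'):
        port_id = event_name[len('network-changed-'):]
        refresh_cache(instance, port_id)   # -> "Refreshing network info cache for port ..."

waiters = {}
refresh = lambda inst, port: print('refreshing cache of %s for port %s' % (inst, port))
handle_external_event('e1873f82', 'network-vif-plugged-4d19ac4f', waiters, refresh)
handle_external_event('e1873f82', 'network-changed-4d19ac4f', waiters, refresh)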
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1011.507768] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquiring lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1011.508124] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1011.508282] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance network_info: |[{"id": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "address": "fa:16:3e:c8:d0:42", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4d19ac4f-d8", "ovs_interfaceid": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1011.508948] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-313feb20-d514-4f85-a925-41cbc5d3eb3d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.512817] env[66583]: DEBUG oslo_concurrency.lockutils [req-714ed766-823c-4ef7-a42d-d93c8e68fb9c req-ea8f1f15-744c-447f-8f65-024e0e620d2d service nova] Releasing lock "refresh_cache-08689558-cc57-43c5-b56e-f9785b515717" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1011.513426] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1011.513553] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock 
"refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1011.513689] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1011.516700] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquired lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1011.516700] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Refreshing network info cache for port 4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1011.516700] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c8:d0:42', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac2c9d07-ed01-47a9-88f1-562992bc1076', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4d19ac4f-d881-4626-9fa1-b0909f18a52d', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1011.523016] env[66583]: DEBUG oslo.service.loopingcall [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1011.527860] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1011.528831] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cf7d3328-6bd3-4362-a98d-b9e396c49b8e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.547307] env[66583]: DEBUG oslo_vmware.api [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Task: {'id': task-3470376, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.062154} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1011.548149] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1011.548389] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1011.548808] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1011.548808] env[66583]: INFO nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Took 1.31 seconds to destroy the instance on the hypervisor. [ 1011.550151] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cbdd1236-2ce4-45c0-a98e-023173b97aee {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.552131] env[66583]: DEBUG nova.compute.claims [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1011.552306] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1011.552574] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1011.558024] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1011.558024] env[66583]: value = "task-3470377" [ 1011.558024] env[66583]: _type = "Task" [ 1011.558024] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1011.565442] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470377, 'name': CreateVM_Task} progress is 5%. 
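The `Aborting claim` and `compute_resources` lock lines are the resource tracker rolling back the failed build's resource claim so the host's accounting returns to its pre-build state. A toy version of that claim-as-context-manager pattern (stdlib only; Nova's real claims in nova/compute/claims.py track CPU, memory, and disk against the tracker, not a bare dict):

import threading
from contextlib import contextmanager

resources_lock = threading.Lock()
usage = {'vcpus': 0}

@contextmanager
def claim(vcpus):
    with resources_lock:          # "Acquiring lock 'compute_resources' ..."
        usage['vcpus'] += vcpus
    try:
        yield
    except Exception:
        with resources_lock:      # abort path: return the resources
            usage['vcpus'] -= vcpus
        raise

try:
    with claim(2):
        raise RuntimeError('spawn failed')   # e.g. the InvalidArgument fault
except RuntimeError:
    pass
print(usage)   # {'vcpus': 0} -- the claim was rolled back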
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1011.580804] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1011.583636] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1011.629961] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1011.693403] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1011.693628] env[66583]: DEBUG oslo_vmware.rw_handles [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
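The rw_handles lines describe streaming 21318656 bytes of image data over HTTPS to the ESX host's /folder endpoint, authenticated with the service ticket acquired via SessionManager.AcquireGenericServiceTicket. A sketch of what that write handle amounts to, assuming the requests library as a stand-in for oslo.vmware's raw HTTP connection handling (not runnable against a real host without valid cookies and CA setup):

import requests  # assumption: requests replaces oslo.vmware's raw HTTP handle

def upload_vmdk(url, image_iter, cookies):
    """PUT image data to the ESX /folder endpoint (illustrative sketch).

    image_iter may be any iterable of byte chunks; requests streams it,
    mirroring the chunked writes rw_handles performs until the log's
    "Completed reading data from the image iterator." point."""
    resp = requests.put(url, data=image_iter, cookies=cookies, verify=False)
    resp.raise_for_status()
    return resp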
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1011.790304] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa7e28f3-79bf-416d-a555-b4c6748a0b11 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.798092] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51066bfa-b96a-4300-944d-058727915e76 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.832470] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc732d39-84ec-46f5-ac1d-0e0168ebf27c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.841261] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bf31e03-0c90-4a7a-9add-b85eac1784a9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1011.864255] env[66583]: DEBUG nova.compute.provider_tree [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1011.871767] env[66583]: DEBUG nova.scheduler.client.report [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1011.885317] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.333s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1011.885940] env[66583]: ERROR nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1011.885940] env[66583]: Faults: ['InvalidArgument'] [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] Traceback (most recent call last): [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: 
e7664037-62b0-4195-b935-eab75d232f5d] self.driver.spawn(context, instance, image_meta, [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self._fetch_image_if_missing(context, vi) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] image_cache(vi, tmp_image_ds_loc) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] vm_util.copy_virtual_disk( [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] session._wait_for_task(vmdk_copy_task) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return self.wait_for_task(task_ref) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return evt.wait() [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] result = hub.switch() [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] return self.greenlet.switch() [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] self.f(*self.args, **self.kw) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 
1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] raise exceptions.translate_fault(task_info.error) [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] Faults: ['InvalidArgument'] [ 1011.885940] env[66583]: ERROR nova.compute.manager [instance: e7664037-62b0-4195-b935-eab75d232f5d] [ 1011.886867] env[66583]: DEBUG nova.compute.utils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1011.888367] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Build of instance e7664037-62b0-4195-b935-eab75d232f5d was re-scheduled: A specified parameter was not correct: fileType [ 1011.888367] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1011.888622] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1011.888787] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
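Both tracebacks end the same way: vCenter rejects the fileType parameter of the disk copy with an InvalidArgument fault, oslo.vmware finds no more specific exception registered for it (the earlier `Fault InvalidArgument not matched.` line) and falls back to the generic VimFaultException, and the compute manager reacts by re-scheduling the build rather than failing it outright. A compressed sketch of that control flow (the exception class here is a local stand-in mirroring oslo_vmware.exceptions):

class VimFaultException(Exception):
    """Local stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

SPECIFIC_FAULTS = {}   # fault name -> exception class; 'InvalidArgument' absent

def translate_fault(fault_name, message):
    cls = SPECIFIC_FAULTS.get(fault_name)
    if cls is None:
        # -> "Fault InvalidArgument not matched." then the generic exception
        return VimFaultException([fault_name], message)
    return cls(message)

def build_and_run(spawn):
    try:
        spawn()
    except VimFaultException as exc:
        # -> "Build of instance ... was re-scheduled: ..."
        print('re-scheduled:', exc, exc.fault_list)

def failing_spawn():
    raise translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')

build_and_run(failing_spawn)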
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1011.888944] env[66583]: DEBUG nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1011.889109] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1012.029078] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Updating instance_info_cache with network_info: [{"id": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "address": "fa:16:3e:51:d9:b5", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b5b25bb-74", "ovs_interfaceid": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.042802] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1012.043142] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance network_info: |[{"id": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "address": "fa:16:3e:51:d9:b5", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b5b25bb-74", "ovs_interfaceid": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1012.043521] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:51:d9:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ac2c9d07-ed01-47a9-88f1-562992bc1076', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5b5b25bb-74dc-4cc8-b3b1-f409371258cf', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1012.051186] env[66583]: DEBUG oslo.service.loopingcall [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1012.051621] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1012.051848] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-cd8e0e4e-5951-4e4a-af26-ee6160ad9227 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.076286] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470377, 'name': CreateVM_Task, 'duration_secs': 0.303816} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1012.077431] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1012.077623] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1012.077623] env[66583]: value = "task-3470378" [ 1012.077623] env[66583]: _type = "Task" [ 1012.077623] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.078225] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1012.078389] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1012.078731] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1012.078995] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-346f7835-bcac-439c-95fa-2a13f015654a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.088487] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1012.088487] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52165e5a-86ad-1bf1-3bd2-c3265e412fb6" [ 1012.088487] env[66583]: _type = "Task" [ 1012.088487] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.092395] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470378, 'name': CreateVM_Task} progress is 6%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1012.100125] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52165e5a-86ad-1bf1-3bd2-c3265e412fb6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1012.169257] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Updated VIF entry in instance network info cache for port 4d19ac4f-d881-4626-9fa1-b0909f18a52d. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1012.169560] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Updating instance_info_cache with network_info: [{"id": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "address": "fa:16:3e:c8:d0:42", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4d19ac4f-d8", "ovs_interfaceid": "4d19ac4f-d881-4626-9fa1-b0909f18a52d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.179210] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Releasing lock "refresh_cache-e1873f82-8e24-460a-b5cb-36e3bf06abcb" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1012.179495] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Received event network-vif-plugged-5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1012.179696] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquiring lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1012.179895] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1012.180062] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1012.180220] env[66583]: DEBUG nova.compute.manager
[req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] No waiting events found dispatching network-vif-plugged-5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1012.180459] env[66583]: WARNING nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Received unexpected event network-vif-plugged-5b5b25bb-74dc-4cc8-b3b1-f409371258cf for instance with vm_state building and task_state spawning. [ 1012.180620] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Received event network-changed-5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1012.180820] env[66583]: DEBUG nova.compute.manager [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Refreshing instance network info cache due to event network-changed-5b5b25bb-74dc-4cc8-b3b1-f409371258cf. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1012.181117] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquiring lock "refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1012.181298] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Acquired lock "refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1012.181456] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Refreshing network info cache for port 5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1012.589036] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470378, 'name': CreateVM_Task, 'duration_secs': 0.278388} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1012.589209] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1012.590296] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1012.603827] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1012.603827] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1012.603827] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1012.604078] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1012.607302] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1012.607700] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a2296f0c-78f1-4a02-9cb8-21480f88bf07 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1012.615194] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1012.615194] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]523cffdb-1318-fea1-1ae3-d8f8d28a9b52" [ 1012.615194] env[66583]: _type = "Task" [ 1012.615194] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1012.622864] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]523cffdb-1318-fea1-1ae3-d8f8d28a9b52, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1012.646814] env[66583]: DEBUG nova.network.neutron [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.656302] env[66583]: INFO nova.compute.manager [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] [instance: e7664037-62b0-4195-b935-eab75d232f5d] Took 0.77 seconds to deallocate network for instance. [ 1012.713133] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Updated VIF entry in instance network info cache for port 5b5b25bb-74dc-4cc8-b3b1-f409371258cf. {{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1012.713607] env[66583]: DEBUG nova.network.neutron [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Updating instance_info_cache with network_info: [{"id": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "address": "fa:16:3e:51:d9:b5", "network": {"id": "50ca59f4-a2ac-4bc8-9bff-07e53c5e4ade", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1166809360-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3d88e984e6674da9a33c06705de9d7e2", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ac2c9d07-ed01-47a9-88f1-562992bc1076", "external-id": "nsx-vlan-transportzone-968", "segmentation_id": 968, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5b5b25bb-74", "ovs_interfaceid": "5b5b25bb-74dc-4cc8-b3b1-f409371258cf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1012.727694] env[66583]: DEBUG oslo_concurrency.lockutils [req-5f41004c-4cc4-4743-9745-408f36a2c1a0 req-53f46439-4b76-451d-b138-c27815135194 service nova] Releasing lock "refresh_cache-86690ef8-17b8-4d25-a2a4-54c68c98ac7a" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1012.745056] env[66583]: INFO nova.scheduler.client.report [None 
req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Deleted allocations for instance e7664037-62b0-4195-b935-eab75d232f5d [ 1012.760884] env[66583]: DEBUG oslo_concurrency.lockutils [None req-156e8c05-88ec-4ec9-a513-50d162d8ac97 tempest-ServerAddressesTestJSON-1148788319 tempest-ServerAddressesTestJSON-1148788319-project-member] Lock "e7664037-62b0-4195-b935-eab75d232f5d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 190.927s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1013.125489] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1013.125766] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1013.125950] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1042.847693] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.843226] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.863158] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.863324] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1045.863433] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1045.881783] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Skipping network cache update for instance because it is Building. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.881938] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882085] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882212] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882339] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882464] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882585] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882709] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882827] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1045.882944] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1045.883354] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1045.883552] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1046.846833] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1046.846833] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1046.846833] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1047.843100] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1047.845842] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1048.846114] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1048.856731] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.858023] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.858023] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1048.858023] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: 
domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1048.858617] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e517c1c-4394-4eaa-9a58-93580253c491 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.869191] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58837523-1b7b-4aeb-ad01-315f8c2cc6f9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.885131] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42ffc8d8-a188-4e73-8218-68218cafac4d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.892050] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-390dad3d-f083-406b-afcf-d8e43bc3d63e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1048.922226] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180945MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1048.922417] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1048.922575] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1048.983902] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 504d18e4-8457-431b-b6cb-b26a0c64b14b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984127] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984303] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 68449c86-cda6-46ff-a349-c2072829257e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984443] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 035e8729-c02f-490e-a0e4-b8877b52e75b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984569] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e9136963-e0fc-4344-880b-a21549f2cf23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984692] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89ccce06-2094-4f87-a77b-cad92d351dfa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984811] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.984928] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.985054] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 08689558-cc57-43c5-b56e-f9785b515717 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1048.985246] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1048.985385] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1728MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1049.085646] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01b18b15-a15d-46cd-9116-cf03f777a5d6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.093017] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3edf889e-9a1f-441b-840c-3455a5e01869 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.121856] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5bee4e89-3696-4e2b-9da4-cdca42dec192 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.128531] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-415dd8ba-b614-4e24-87d0-5c7589afdd9c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1049.140948] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1049.148931] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1049.163119] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1049.163297] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1057.273219] env[66583]: WARNING oslo_vmware.rw_handles [None req-b45b32f9-347e-481f-ab35-3e603e084190 
tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1057.273219] env[66583]: ERROR oslo_vmware.rw_handles [ 1057.274058] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1057.275463] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1057.275718] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Copying Virtual Disk [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/82b5ffca-c6b7-4be2-a5bd-06a2f6ae2e64/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1057.276027] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-52a9f2c8-93c6-4935-806d-31f5c98c9ec4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.284510] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 1057.284510] env[66583]: value = "task-3470379" [ 1057.284510] env[66583]: _type = "Task" [ 1057.284510] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.292294] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': task-3470379, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.795048] env[66583]: DEBUG oslo_vmware.exceptions [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1057.795222] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1057.795862] env[66583]: ERROR nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.795862] env[66583]: Faults: ['InvalidArgument'] [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Traceback (most recent call last): [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] yield resources [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self.driver.spawn(context, instance, image_meta, [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self._fetch_image_if_missing(context, vi) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] image_cache(vi, tmp_image_ds_loc) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 
537, in _cache_sparse_image [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] vm_util.copy_virtual_disk( [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] session._wait_for_task(vmdk_copy_task) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return self.wait_for_task(task_ref) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return evt.wait() [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] result = hub.switch() [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return self.greenlet.switch() [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self.f(*self.args, **self.kw) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] raise exceptions.translate_fault(task_info.error) [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Faults: ['InvalidArgument'] [ 1057.795862] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] [ 1057.796929] env[66583]: INFO nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Terminating instance [ 1057.797769] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1057.798058] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1057.798272] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78066584-c541-4874-876a-6d0c102c6041 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.800401] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1057.800666] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1057.801381] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-522c65d8-b459-4470-a802-5925186d27c9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.808356] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1057.808592] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4895c521-07df-4c86-b561-5d897f075412 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.810756] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1057.810927] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1057.811857] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f998f0ab-a675-4a85-b0d4-ce1a36941f10 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.817042] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for the task: (returnval){ [ 1057.817042] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52230859-7a81-a0fa-e78f-01b905d57495" [ 1057.817042] env[66583]: _type = "Task" [ 1057.817042] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.823932] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52230859-7a81-a0fa-e78f-01b905d57495, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1057.933077] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1057.933312] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1057.933491] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Deleting the datastore file [datastore1] 504d18e4-8457-431b-b6cb-b26a0c64b14b {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1057.933807] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-65406b71-129e-4f04-afb2-d2475c5a91cd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1057.940351] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 1057.940351] env[66583]: value = "task-3470381" [ 1057.940351] env[66583]: _type = "Task" [ 1057.940351] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1057.947954] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': task-3470381, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1058.326681] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1058.327090] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Creating directory with path [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1058.327228] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1c2c46f6-2ff2-47d8-a1b5-fe8357143b29 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.339479] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Created directory with path [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1058.339670] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Fetch image to [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1058.339838] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1058.340547] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45bd666c-e2b4-4def-a667-4b88a4495666 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.347294] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-102e0353-048b-483e-992f-edc56c399099 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.356286] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37d4798c-08c3-4947-a7e0-d00c10111227 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.386902] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6be2158-798c-45f1-b1cb-d1c9da729d83 {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.392300] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-601f83ce-8ccb-4367-bc0f-613138d3c887 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.412668] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1058.450102] env[66583]: DEBUG oslo_vmware.api [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': task-3470381, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.08574} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1058.450349] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1058.450531] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1058.450696] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1058.450869] env[66583]: INFO nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Took 0.65 seconds to destroy the instance on the hypervisor. 
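The paired "Acquiring lock ... / Acquired lock ... / Releasing lock ..." DEBUG lines around the devstack-image-cache_base path (lockutils.py:312, 315 and 333) are emitted by oslo_concurrency's lock() context manager, which the image-cache code uses to serialize concurrent fetches of the same image. A minimal sketch of that pattern, with the lock name copied from the log and the actual datastore search/copy work replaced by a hypothetical placeholder callable, could look like this (not Nova's actual code):

from oslo_concurrency import lockutils

# Lock name taken from the log above; the UUID identifies the cached VMDK.
CACHE_LOCK = "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af"

def fetch_image_if_missing(download_fn):
    # lockutils.lock() is a context manager. With external=False it
    # serializes greenthreads within a single process, producing the
    # "Acquiring lock ..." / "Acquired lock ..." / "Releasing lock ..."
    # DEBUG lines seen at lockutils.py:312/315/333 in this log.
    with lockutils.lock(CACHE_LOCK, external=False):
        download_fn()  # placeholder for the SearchDatastore/copy work

Keying the lock on the per-image datastore path means requests for different images proceed in parallel while requests for the same image queue up, which is why several tempest workers above block on the same 2a5e619b... lock in turn.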
[ 1058.452957] env[66583]: DEBUG nova.compute.claims [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1058.453141] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1058.453353] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1058.457945] env[66583]: DEBUG oslo_vmware.rw_handles [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1058.515999] env[66583]: DEBUG oslo_vmware.rw_handles [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1058.516211] env[66583]: DEBUG oslo_vmware.rw_handles [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1058.624794] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99dd1bd5-14b2-4c7b-b188-4938f231275a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.632456] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57afdcff-a6fa-44eb-b16c-cff3a1758327 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.661639] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dc4e245-4564-4cd2-b269-20bcb3d217ad {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.668917] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49d9c098-63b3-4d76-a326-aeecafaaf4d6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1058.681735] env[66583]: DEBUG nova.compute.provider_tree [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1058.690384] env[66583]: DEBUG nova.scheduler.client.report [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1058.703056] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.250s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1058.703576] env[66583]: ERROR nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1058.703576] env[66583]: Faults: ['InvalidArgument'] [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Traceback (most recent call last): [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] 
self.driver.spawn(context, instance, image_meta, [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self._fetch_image_if_missing(context, vi) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] image_cache(vi, tmp_image_ds_loc) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] vm_util.copy_virtual_disk( [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] session._wait_for_task(vmdk_copy_task) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return self.wait_for_task(task_ref) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return evt.wait() [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] result = hub.switch() [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] return self.greenlet.switch() [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] self.f(*self.args, **self.kw) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1058.703576] env[66583]: ERROR 
nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] raise exceptions.translate_fault(task_info.error) [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Faults: ['InvalidArgument'] [ 1058.703576] env[66583]: ERROR nova.compute.manager [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] [ 1058.704347] env[66583]: DEBUG nova.compute.utils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1058.705669] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Build of instance 504d18e4-8457-431b-b6cb-b26a0c64b14b was re-scheduled: A specified parameter was not correct: fileType [ 1058.705669] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1058.706091] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1058.706272] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1058.706447] env[66583]: DEBUG nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1058.706629] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1059.478797] env[66583]: DEBUG nova.network.neutron [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1059.488660] env[66583]: INFO nova.compute.manager [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 504d18e4-8457-431b-b6cb-b26a0c64b14b] Took 0.78 seconds to deallocate network for instance. 
[ 1059.574816] env[66583]: INFO nova.scheduler.client.report [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Deleted allocations for instance 504d18e4-8457-431b-b6cb-b26a0c64b14b [ 1059.595432] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b45b32f9-347e-481f-ab35-3e603e084190 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "504d18e4-8457-431b-b6cb-b26a0c64b14b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 195.763s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1062.618697] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "c29638e8-98fd-4de7-8628-932b19087ecd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1062.619442] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "c29638e8-98fd-4de7-8628-932b19087ecd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1062.629959] env[66583]: DEBUG nova.compute.manager [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Starting instance... 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1062.684675] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1062.684937] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1062.686431] env[66583]: INFO nova.compute.claims [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1062.886018] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd211fcc-45ae-426d-9a48-d2955f790338 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1062.893708] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40c3e0e8-a343-4369-8ca2-50ed45c67b27 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1062.927365] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a56d2243-f740-42ce-bea8-dc3515ede1fe {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1062.937250] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45f6402f-a847-4481-a238-75adf245c680 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1062.949702] env[66583]: DEBUG nova.compute.provider_tree [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1062.959416] env[66583]: DEBUG nova.scheduler.client.report [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1062.977262] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 
tempest-ServersTestMultiNic-187228492-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.292s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1062.978016] env[66583]: DEBUG nova.compute.manager [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Start building networks asynchronously for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1063.015461] env[66583]: DEBUG nova.compute.utils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Using /dev/sd instead of None {{(pid=66583) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1063.016913] env[66583]: DEBUG nova.compute.manager [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Allocating IP information in the background. {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1063.019283] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] allocate_for_instance() {{(pid=66583) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1063.026570] env[66583]: DEBUG nova.compute.manager [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Start building block device mappings for instance. {{(pid=66583) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1063.081042] env[66583]: DEBUG nova.policy [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f899691878e549e59f3e0e1ebe8ad2a8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '09706dc60f2148b5a1b340af34b11f0d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=66583) authorize /opt/stack/nova/nova/policy.py:203}} [ 1063.091255] env[66583]: DEBUG nova.compute.manager [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Start spawning the instance on the hypervisor. 
{{(pid=66583) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1063.113076] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-17T04:21:49Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-17T04:21:32Z,direct_url=,disk_format='vmdk',id=2a5e619b-8532-4b3c-9d86-85994a7987af,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='41f302a7ddc84085a05c55c0788e6a8e',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-17T04:21:32Z,virtual_size=,visibility=), allow threads: False {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1063.113349] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Flavor limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1063.113510] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Image limits 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1063.113696] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Flavor pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1063.113846] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Image pref 0:0:0 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1063.113996] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=66583) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1063.114406] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1063.114621] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1063.115526] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 
tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Got 1 possible topologies {{(pid=66583) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1063.115747] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1063.116041] env[66583]: DEBUG nova.virt.hardware [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=66583) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1063.116914] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4fe3bc2-8f5f-4503-98fa-f2590b74817f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1063.125599] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f62c005d-5e1d-4871-987a-84e17bad2497 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1063.467461] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Successfully created port: 11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1063.896604] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Successfully created port: 36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1064.378052] env[66583]: DEBUG nova.compute.manager [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-vif-plugged-11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1064.378319] env[66583]: DEBUG oslo_concurrency.lockutils [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] Acquiring lock "c29638e8-98fd-4de7-8628-932b19087ecd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1064.378535] env[66583]: DEBUG oslo_concurrency.lockutils [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] Lock "c29638e8-98fd-4de7-8628-932b19087ecd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1064.378705] env[66583]: DEBUG oslo_concurrency.lockutils [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] Lock 
"c29638e8-98fd-4de7-8628-932b19087ecd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1064.378871] env[66583]: DEBUG nova.compute.manager [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] No waiting events found dispatching network-vif-plugged-11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1064.379128] env[66583]: WARNING nova.compute.manager [req-78b7c5ba-ab4f-4763-92fc-a4dcc4cb4a1c req-fb986167-6030-4b38-abc8-b08bba678364 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received unexpected event network-vif-plugged-11efbf9d-f286-42b4-a482-9917f9905d94 for instance with vm_state building and task_state spawning. [ 1064.457939] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Successfully updated port: 11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1066.405977] env[66583]: DEBUG nova.compute.manager [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-changed-11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1066.406216] env[66583]: DEBUG nova.compute.manager [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Refreshing instance network info cache due to event network-changed-11efbf9d-f286-42b4-a482-9917f9905d94. {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1066.406425] env[66583]: DEBUG oslo_concurrency.lockutils [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] Acquiring lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1066.406568] env[66583]: DEBUG oslo_concurrency.lockutils [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] Acquired lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1066.406751] env[66583]: DEBUG nova.network.neutron [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Refreshing network info cache for port 11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1066.440089] env[66583]: DEBUG nova.network.neutron [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1066.596401] env[66583]: DEBUG nova.network.neutron [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1066.605351] env[66583]: DEBUG oslo_concurrency.lockutils [req-a1d6d85d-cd64-4c55-b22f-6d4f6e39833e req-f208288b-f968-4c60-a6c8-4be387d64753 service nova] Releasing lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1067.988789] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Successfully updated port: 36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1068.001321] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1068.001460] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1068.001631] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1068.036467] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Instance cache missing network info. 
{{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1068.758683] env[66583]: DEBUG nova.compute.manager [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-vif-plugged-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1068.758909] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Acquiring lock "c29638e8-98fd-4de7-8628-932b19087ecd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1068.759509] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Lock "c29638e8-98fd-4de7-8628-932b19087ecd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1068.759687] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Lock "c29638e8-98fd-4de7-8628-932b19087ecd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1068.759856] env[66583]: DEBUG nova.compute.manager [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] No waiting events found dispatching network-vif-plugged-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1068.760031] env[66583]: WARNING nova.compute.manager [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received unexpected event network-vif-plugged-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 for instance with vm_state building and task_state spawning. [ 1068.760201] env[66583]: DEBUG nova.compute.manager [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-changed-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1068.760355] env[66583]: DEBUG nova.compute.manager [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Refreshing instance network info cache due to event network-changed-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2. 
{{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11009}} [ 1068.760522] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Acquiring lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1068.835866] env[66583]: DEBUG nova.network.neutron [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Updating instance_info_cache with network_info: [{"id": "11efbf9d-f286-42b4-a482-9917f9905d94", "address": "fa:16:3e:34:3e:4b", "network": {"id": "b8477d5b-9e87-4c83-8f38-306bf22abb2a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1856144987", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65497291-07f3-434c-bd42-657a0cb03365", "external-id": "nsx-vlan-transportzone-279", "segmentation_id": 279, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11efbf9d-f2", "ovs_interfaceid": "11efbf9d-f286-42b4-a482-9917f9905d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "address": "fa:16:3e:11:36:c4", "network": {"id": "3a91be16-00b4-4e41-b806-c38af1853b82", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-825860812", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36d4ac78-bf", "ovs_interfaceid": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1068.848854] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Releasing lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1068.849205] env[66583]: DEBUG nova.compute.manager [None 
req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Instance network_info: |[{"id": "11efbf9d-f286-42b4-a482-9917f9905d94", "address": "fa:16:3e:34:3e:4b", "network": {"id": "b8477d5b-9e87-4c83-8f38-306bf22abb2a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1856144987", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65497291-07f3-434c-bd42-657a0cb03365", "external-id": "nsx-vlan-transportzone-279", "segmentation_id": 279, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11efbf9d-f2", "ovs_interfaceid": "11efbf9d-f286-42b4-a482-9917f9905d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "address": "fa:16:3e:11:36:c4", "network": {"id": "3a91be16-00b4-4e41-b806-c38af1853b82", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-825860812", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36d4ac78-bf", "ovs_interfaceid": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=66583) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1068.849487] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Acquired lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1068.849663] env[66583]: DEBUG nova.network.neutron [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Refreshing network info cache for port 36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1068.853968] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: 
c29638e8-98fd-4de7-8628-932b19087ecd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:3e:4b', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '65497291-07f3-434c-bd42-657a0cb03365', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '11efbf9d-f286-42b4-a482-9917f9905d94', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:11:36:c4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a071ecf4-e713-4f97-9271-8c17952f6dee', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2', 'vif_model': 'vmxnet3'}] {{(pid=66583) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1068.860418] env[66583]: DEBUG oslo.service.loopingcall [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1068.863367] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Creating VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1068.864130] env[66583]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dda4d176-9ce3-4084-888b-8eef0de26f1b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1068.886935] env[66583]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1068.886935] env[66583]: value = "task-3470382" [ 1068.886935] env[66583]: _type = "Task" [ 1068.886935] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1068.894741] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470382, 'name': CreateVM_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1069.114598] env[66583]: DEBUG nova.network.neutron [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Updated VIF entry in instance network info cache for port 36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2. 
{{(pid=66583) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1069.115039] env[66583]: DEBUG nova.network.neutron [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Updating instance_info_cache with network_info: [{"id": "11efbf9d-f286-42b4-a482-9917f9905d94", "address": "fa:16:3e:34:3e:4b", "network": {"id": "b8477d5b-9e87-4c83-8f38-306bf22abb2a", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1856144987", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.100", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "65497291-07f3-434c-bd42-657a0cb03365", "external-id": "nsx-vlan-transportzone-279", "segmentation_id": 279, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap11efbf9d-f2", "ovs_interfaceid": "11efbf9d-f286-42b4-a482-9917f9905d94", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "address": "fa:16:3e:11:36:c4", "network": {"id": "3a91be16-00b4-4e41-b806-c38af1853b82", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-825860812", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36d4ac78-bf", "ovs_interfaceid": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1069.125144] env[66583]: DEBUG oslo_concurrency.lockutils [req-77afc998-1034-4038-80d1-063e836d71c8 req-a28de0f6-b5fc-4ff0-b609-90c7da984bed service nova] Releasing lock "refresh_cache-c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1069.397211] env[66583]: DEBUG oslo_vmware.api [-] Task: {'id': task-3470382, 'name': CreateVM_Task, 'duration_secs': 0.306587} completed successfully. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1069.397211] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Created VM on the ESX host {{(pid=66583) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1069.397898] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1069.398078] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1069.398410] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1069.398692] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-67773906-af1f-4f0a-803f-927b95dd4dbd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1069.403175] env[66583]: DEBUG oslo_vmware.api [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 1069.403175] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52df8e2a-76b2-6775-8f93-43eda05d98e7" [ 1069.403175] env[66583]: _type = "Task" [ 1069.403175] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1069.410674] env[66583]: DEBUG oslo_vmware.api [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52df8e2a-76b2-6775-8f93-43eda05d98e7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1069.914119] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1069.914390] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Processing image 2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1069.914604] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1103.163601] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1105.847587] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1105.847896] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1105.847896] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1105.867723] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.867897] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868045] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868181] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Skipping network cache update for instance because it is Building. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868349] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868434] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868551] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868674] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868998] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1105.868998] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. 
{{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1107.289484] env[66583]: WARNING oslo_vmware.rw_handles [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1107.289484] env[66583]: ERROR oslo_vmware.rw_handles [ 1107.289484] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1107.290800] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1107.291130] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Copying Virtual Disk [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/b4e26eda-e2ed-49ad-8e84-d8fc5f0bea26/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1107.291499] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-50588b48-2385-42e5-8faa-e74bee3401ec {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.300734] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for the task: (returnval){ [ 1107.300734] env[66583]: value = 
"task-3470383" [ 1107.300734] env[66583]: _type = "Task" [ 1107.300734] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1107.309775] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Task: {'id': task-3470383, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1107.811100] env[66583]: DEBUG oslo_vmware.exceptions [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1107.811356] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1107.811964] env[66583]: ERROR nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.811964] env[66583]: Faults: ['InvalidArgument'] [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Traceback (most recent call last): [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] yield resources [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self.driver.spawn(context, instance, image_meta, [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self._fetch_image_if_missing(context, vi) [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] image_cache(vi, tmp_image_ds_loc) [ 1107.811964] env[66583]: ERROR nova.compute.manager 
[instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] vm_util.copy_virtual_disk( [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] session._wait_for_task(vmdk_copy_task) [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return self.wait_for_task(task_ref) [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return evt.wait() [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] result = hub.switch() [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return self.greenlet.switch() [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self.f(*self.args, **self.kw) [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] raise exceptions.translate_fault(task_info.error) [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Faults: ['InvalidArgument'] [ 1107.811964] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] [ 1107.813063] env[66583]: INFO nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Terminating instance [ 1107.813965] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1107.815252] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1107.815805] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1107.815956] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquired lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1107.816133] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1107.819806] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1f6acfa2-e1e7-403c-86e2-4746a79e9573 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.830421] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1107.830734] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1107.831849] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a6ea8607-5cbb-4d17-bb1a-56422ead7853 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
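The lock traffic here (Acquiring/Acquired with a "waited" time, Releasing with a "held" time) is oslo.concurrency's lockutils serializing access to the shared image-cache vmdk: the request whose spawn just failed released the cache lock, and the next build (req-8434f75e) immediately acquired it to populate the cache itself. The same pattern, reduced to a sketch; the lock name mirrors the log, and populate_image_cache is a placeholder for the fetch-and-copy work:

    from oslo_concurrency import lockutils

    def populate_image_cache():
        """Placeholder for the real work: download the image and copy it
        into devstack-image-cache_base via CopyVirtualDisk_Task."""

    CACHE_VMDK = ('[datastore1] devstack-image-cache_base/'
                  '2a5e619b-8532-4b3c-9d86-85994a7987af/'
                  '2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk')

    # Only one thread of execution at a time may touch the cached disk; the
    # "waited"/"held" durations in the log are measured around this block.
    with lockutils.lock(CACHE_VMDK):
        populate_image_cache()

[ 1107.843136] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Waiting for the task: (returnval){ [ 1107.843136] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52e73c55-c5f8-47a6-3a6b-82af80c4e9f9" [ 1107.843136] env[66583]: _type = "Task" [ 1107.843136] env[66583]: } to complete. 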
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1107.846100] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1107.847257] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1107.849926] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1107.854683] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1107.854942] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1107.855721] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52e73c55-c5f8-47a6-3a6b-82af80c4e9f9, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1107.917985] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1107.929190] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Releasing lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1107.929622] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Start destroying the instance on the hypervisor. 
{{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1107.929832] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1107.930982] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66b40322-e14b-4955-9cd8-afc41e1c121a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.938392] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1107.938823] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dd36ebdb-a09e-48d7-9a0d-b2fbb9341b31 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.975233] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1107.975518] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1107.975709] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Deleting the datastore file [datastore1] 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1107.975935] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7e18381-8ed4-45a5-86d8-bc3c0c953506 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1107.983114] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for the task: (returnval){ [ 1107.983114] env[66583]: value = "task-3470385" [ 1107.983114] env[66583]: _type = "Task" [ 1107.983114] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1107.991165] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Task: {'id': task-3470385, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1108.350129] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1108.350524] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Creating directory with path [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1108.350650] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-39f3cb62-14d1-4905-a22b-1b5bc52c4f8d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.362932] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Created directory with path [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1108.363201] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Fetch image to [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1108.363394] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1108.364163] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d9002b7-9bf6-4128-be92-d3128b43e810 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.371141] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c7b37a-f0ff-4ffc-8342-bdf8dfff5451 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.380262] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c78e491-9c54-4272-8424-a6953b8bf5ac {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.412019] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6647d499-eeb4-4d2c-876a-2bb73aa79497 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.418392] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-becb046e-66a4-48cc-a6e2-91611f6c9a4a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.441071] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1108.488518] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1108.544934] env[66583]: DEBUG oslo_vmware.api [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Task: {'id': task-3470385, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.039343} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1108.546465] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1108.546708] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1108.546894] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1108.547088] env[66583]: INFO nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Took 0.62 seconds to destroy the instance on the hypervisor.
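The rw_handles entry above shows how the image bits actually move: after SessionManager.AcquireGenericServiceTicket authorizes the transfer, Nova streams the Glance image over an HTTPS PUT to the ESX host's /folder endpoint, with the dcPath and dsName query parameters selecting the datacenter and datastore. The RemoteDisconnected warnings in this log come from the matching close step, where the handle calls getresponse() and the host has already dropped the connection. A rough standard-library sketch of such a write handle; the host, path, and ticket value are illustrative, not the library's exact wiring:

    import http.client

    def write_to_datastore(host, path, data, ticket_cookie):
        # e.g. path = ('/folder/vmware_temp/.../tmp-sparse.vmdk'
        #              '?dcPath=ha-datacenter&dsName=datastore1')
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest('PUT', path)
        conn.putheader('Cookie', ticket_cookie)  # generic service ticket
        conn.putheader('Content-Length', str(len(data)))
        conn.endheaders()
        conn.send(data)  # oslo.vmware streams this chunk by chunk
        try:
            # this is the call that raised RemoteDisconnected above when
            # the server closed without sending a status line
            return conn.getresponse().status
        finally:
            conn.close()

[ 1108.547372] env[66583]: DEBUG oslo.service.loopingcall [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 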
{{(pid=66583) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1108.547867] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1108.548047] env[66583]: DEBUG oslo_vmware.rw_handles [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1108.548344] env[66583]: DEBUG nova.compute.manager [-] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Skipping network deallocation for instance since networking was not requested. {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1108.550678] env[66583]: DEBUG nova.compute.claims [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1108.550853] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1108.551146] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1108.709182] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e5063a9-8d04-447d-bd31-5738c8132827 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.716852] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ddb99a1-b7e5-4298-b23c-b1f60730c83f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.761031] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a82e0b6a-f7b0-436c-87b9-f9964b607a56 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1108.767792] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d8172f9-cc45-42ff-8fd7-5951c45492cd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
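When the spawn fails, the compute manager walks back the resource claim made for the build: the "Aborting claim" entry and the compute_resources lock around abort_instance_claim return the instance's CPU/RAM/disk reservation to the tracker. The claim behaves like a context manager that gives everything back on error; a toy version of that shape, where the tracker object and its reserve/release methods are placeholders rather than Nova's real API:

    class ToyClaim:
        """Reserve resources on entry; return them if the build fails."""
        def __init__(self, tracker, resources):
            self.tracker = tracker       # placeholder tracker object
            self.resources = resources   # e.g. {'VCPU': 1, 'MEMORY_MB': 128}
        def __enter__(self):
            self.tracker.reserve(self.resources)
            return self
        def __exit__(self, exc_type, exc, tb):
            if exc_type is not None:
                self.abort()  # spawn raised: undo the reservation
            return False      # never swallow the exception
        def abort(self):
            # corresponds to the "Aborting claim" / abort_instance_claim
            # lock lines above
            self.tracker.release(self.resources)

[ 1108.781729] env[66583]: DEBUG nova.compute.provider_tree [None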
req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1108.790054] env[66583]: DEBUG nova.scheduler.client.report [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1108.805020] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.252s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1108.805020] env[66583]: ERROR nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1108.805020] env[66583]: Faults: ['InvalidArgument'] [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Traceback (most recent call last): [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self.driver.spawn(context, instance, image_meta, [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self._fetch_image_if_missing(context, vi) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] image_cache(vi, tmp_image_ds_loc) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] vm_util.copy_virtual_disk( [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] session._wait_for_task(vmdk_copy_task) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return self.wait_for_task(task_ref) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return evt.wait() [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] result = hub.switch() [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] return self.greenlet.switch() [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] self.f(*self.args, **self.kw) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] raise exceptions.translate_fault(task_info.error) [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Faults: ['InvalidArgument'] [ 1108.805020] env[66583]: ERROR nova.compute.manager [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] [ 1108.805020] env[66583]: DEBUG nova.compute.utils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] VimFaultException {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1108.806978] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Build of instance 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5 was re-scheduled: A specified parameter 
was not correct: fileType [ 1108.806978] env[66583]: Faults: ['InvalidArgument'] {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1108.807414] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1108.808787] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquiring lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1108.808787] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Acquired lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1108.808787] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Building network info cache for instance {{(pid=66583) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1108.832744] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Instance cache missing network info. {{(pid=66583) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1108.852260] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1108.916526] env[66583]: DEBUG nova.network.neutron [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1108.925949] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Releasing lock "refresh_cache-7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1108.926186] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1108.926372] env[66583]: DEBUG nova.compute.manager [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] [instance: 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5] Skipping network deallocation for instance since networking was not requested. {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1109.012321] env[66583]: INFO nova.scheduler.client.report [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Deleted allocations for instance 7f8b60cc-fa88-49ad-8372-bc41ef82f2d5 [ 1109.028409] env[66583]: DEBUG oslo_concurrency.lockutils [None req-176279b8-d166-427c-bfdf-c6020d50e31e tempest-ServerShowV254Test-1028042376 tempest-ServerShowV254Test-1028042376-project-member] Lock "7f8b60cc-fa88-49ad-8372-bc41ef82f2d5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 179.441s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1109.841908] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1109.845502] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1110.847115] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1110.856216] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1110.856472] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1110.856675] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1110.856837] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1110.857988] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-06cd1718-2dfb-4a7f-b551-0ab95a7a19b3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1110.866639] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0780eea9-32e1-4af5-9d5d-0950ee9632ac {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1110.880387] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae69c2cf-1640-4967-859b-4f3550e16e10 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1110.886540] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4683ea72-ac40-499a-bcb5-b26edebc2bf6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1110.914980] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180956MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1110.915161] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1110.915336] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 68449c86-cda6-46ff-a349-c2072829257e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 035e8729-c02f-490e-a0e4-b8877b52e75b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e9136963-e0fc-4344-880b-a21549f2cf23 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 89ccce06-2094-4f87-a77b-cad92d351dfa actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973022] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance 08689558-cc57-43c5-b56e-f9785b515717 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973374] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance c29638e8-98fd-4de7-8628-932b19087ecd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1110.973374] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1110.973429] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1600MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1111.069711] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-541b63e9-2583-481c-906f-5e3c3f7864df {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1111.076947] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b75db217-15b5-433d-8711-2f8b89992b97 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1111.105839] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1073a84f-6df1-42c0-8e9f-585752f0a0d3 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1111.113058] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81e18e0d-14c4-4444-8474-3862ef361cbd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
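The update_available_resource audit ends by recomputing the provider's inventory and comparing it against the copy cached in the provider tree; the "Inventory has not changed" entries that follow mean the comparison came back equal, so no update is sent to the placement API. The check is essentially dict equality over the resource classes; a minimal sketch, with push_to_placement as a placeholder for the REST call:

    def sync_inventory(cached, computed, push_to_placement):
        # cached/computed map resource classes to inventory records, e.g.
        # {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
        #           'step_size': 1, 'allocation_ratio': 4.0}, ...}
        if cached == computed:
            return False  # "Inventory has not changed": skip the API call
        push_to_placement(computed)
        return True

[ 1111.126096] env[66583]: DEBUG nova.compute.provider_tree [None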
req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1111.133860] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1111.146328] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1111.146496] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1126.437181] env[66583]: DEBUG nova.compute.manager [req-0f0e84c7-5799-474c-9980-a319559f4846 req-bac27e9c-4c52-48d7-95dc-423b8542342e service nova] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Received event network-vif-deleted-6709862d-6896-4eaf-a065-2e2174f26dbd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1128.464965] env[66583]: DEBUG nova.compute.manager [req-cbb5d0b9-6979-4fbd-90f5-8374993b675a req-9ee005d0-87e4-4deb-b926-d8f2999b0c26 service nova] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Received event network-vif-deleted-1598c54f-8bc9-4eca-94b4-f0bceefb9fd3 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1128.465267] env[66583]: DEBUG nova.compute.manager [req-cbb5d0b9-6979-4fbd-90f5-8374993b675a req-9ee005d0-87e4-4deb-b926-d8f2999b0c26 service nova] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Received event network-vif-deleted-06baa48e-0c25-4b8b-9381-d08a4a23a21b {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1130.502034] env[66583]: DEBUG nova.compute.manager [req-1ff4798a-5288-4479-9a6b-52ebba56ea65 req-d309ac3e-5849-4388-9e8a-4b5ba0c6c081 service nova] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Received event network-vif-deleted-5b5b25bb-74dc-4cc8-b3b1-f409371258cf {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1130.502271] env[66583]: DEBUG nova.compute.manager [req-1ff4798a-5288-4479-9a6b-52ebba56ea65 req-d309ac3e-5849-4388-9e8a-4b5ba0c6c081 service nova] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Received event network-vif-deleted-4d19ac4f-d881-4626-9fa1-b0909f18a52d {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}
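The "Received event network-vif-deleted-..." bursts are Neutron notifications fanned out to nova-compute through the external_instance_event entry point; each event names an instance, an event type, and a tag (here the deleted port's id), and the manager routes it to the matching handler. A toy dispatcher in the same spirit; the handler body is made up, and Nova's real handlers instead refresh the instance network info cache:

    def on_network_vif_deleted(instance_uuid, port_id):
        # placeholder: the real handler prunes the port from the
        # instance_info_cache seen elsewhere in this log
        print(f"instance {instance_uuid}: VIF {port_id} deleted")

    HANDLERS = {'network-vif-deleted': on_network_vif_deleted}

    def external_instance_event(instance_uuid, name, tag):
        # e.g. name='network-vif-deleted', tag='6709862d-...'
        handler = HANDLERS.get(name)
        if handler is None:
            return  # events with no registered handler are dropped
        handler(instance_uuid, tag)

[ 1130.502345] env[66583]: DEBUG nova.compute.manager [req-1ff4798a-5288-4479-9a6b-52ebba56ea65 req-d309ac3e-5849-4388-9e8a-4b5ba0c6c081 service nova] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Received event 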
network-vif-deleted-2e991e84-4f4a-4046-903b-070d9f4fc9bd {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1153.571466] env[66583]: DEBUG nova.compute.manager [req-07415fdf-0c75-4351-93d8-74b55d8ff3ef req-3201e7b7-bb2c-4eea-8858-dc4f35c4e333 service nova] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Received event network-vif-deleted-50e69df8-06bf-4d72-8d04-e3579b49628e {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1154.818120] env[66583]: WARNING oslo_vmware.rw_handles [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1154.818120] env[66583]: ERROR oslo_vmware.rw_handles [ 1154.818754] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1154.820687] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1154.820943] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Copying Virtual Disk [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/7607d15e-62ca-4c38-9d39-cf28d7bb5c76/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 
1154.821272] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0b7d289c-2957-4078-a678-366a55d491c7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1154.829778] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Waiting for the task: (returnval){ [ 1154.829778] env[66583]: value = "task-3470386" [ 1154.829778] env[66583]: _type = "Task" [ 1154.829778] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1154.837835] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Task: {'id': task-3470386, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1155.339799] env[66583]: DEBUG oslo_vmware.exceptions [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Fault InvalidArgument not matched. {{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1155.340101] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1155.340639] env[66583]: ERROR nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1155.340639] env[66583]: Faults: ['InvalidArgument'] [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] Traceback (most recent call last): [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] yield resources [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] self.driver.spawn(context, instance, image_meta, [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1155.340639] env[66583]: ERROR nova.compute.manager 
[instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] self._fetch_image_if_missing(context, vi) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] image_cache(vi, tmp_image_ds_loc) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] vm_util.copy_virtual_disk( [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] session._wait_for_task(vmdk_copy_task) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] return self.wait_for_task(task_ref) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] return evt.wait() [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] result = hub.switch() [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] return self.greenlet.switch() [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] self.f(*self.args, **self.kw) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] raise exceptions.translate_fault(task_info.error) [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1155.340639] env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] Faults: ['InvalidArgument'] [ 1155.340639] 
env[66583]: ERROR nova.compute.manager [instance: 68449c86-cda6-46ff-a349-c2072829257e] [ 1155.341640] env[66583]: INFO nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Terminating instance [ 1155.342552] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1155.342759] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1155.343397] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1155.343660] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1155.343780] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c19e62c6-c4fc-4ab5-a728-60eeb38ec832 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.346048] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d2f73957-568c-44f8-9802-11b9f818467e {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.352777] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1155.353025] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cb121824-9f1f-45cd-8d02-fd44f9d5e02f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.355212] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1155.355379] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 
tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1155.356335] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-64c9c757-0854-4f98-ad91-e3d09821f0df {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.360923] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1155.360923] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52c20a12-4de9-73a6-e293-85f009ebbe0d" [ 1155.360923] env[66583]: _type = "Task" [ 1155.360923] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1155.369440] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52c20a12-4de9-73a6-e293-85f009ebbe0d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1155.417257] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1155.417560] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1155.417745] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Deleting the datastore file [datastore1] 68449c86-cda6-46ff-a349-c2072829257e {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1155.418015] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-951b70d7-8a39-4613-b462-005d91c06838 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.424337] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Waiting for the task: (returnval){ [ 1155.424337] env[66583]: value = "task-3470388" [ 1155.424337] env[66583]: _type = "Task" [ 1155.424337] env[66583]: } to complete. 
{{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1155.432348] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Task: {'id': task-3470388, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1155.872751] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1155.873024] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1155.873220] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-43034ee7-9911-49ac-8dea-326732fa34e2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.890853] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1155.891063] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Fetch image to [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1155.891269] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1155.892032] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea42b0fa-514c-404b-9cce-3a69d25f24b5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.899027] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ffbbfb7-330b-4f1c-8629-e00460c34c8d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.907929] env[66583]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3b5445c-1544-4003-8eec-9a3195ea6278 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.941858] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf5e3a13-b9f3-4dcb-8408-76943dc7f7b5 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.950136] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b4a9c9de-dd2c-49c1-93cd-04fdd6782f83 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1155.951844] env[66583]: DEBUG oslo_vmware.api [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Task: {'id': task-3470388, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07765} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1155.952099] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1155.952278] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1155.952448] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1155.952634] env[66583]: INFO nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Took 0.61 seconds to destroy the instance on the hypervisor. 
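The wait_for_task / _poll_task records above (task-3470386, task-3470388) all follow the same pattern: submit a vCenter task, poll its state while logging progress, return on success, and on failure raise a translated fault, which is where the VimFaultException in the traceback above originates (raise exceptions.translate_fault(task_info.error)). A minimal sketch of that polling loop, assuming a hypothetical fetch_task_info callable and hypothetical state names; this illustrates the pattern and is not oslo.vmware's actual implementation:

import time

class TaskFailed(Exception):
    """Stand-in for the translated fault raised when a task errors out."""

def wait_for_task(fetch_task_info, poll_interval=0.5):
    # fetch_task_info is a hypothetical callable returning an object with
    # .state in {'queued', 'running', 'success', 'error'}, .progress, .error.
    while True:
        info = fetch_task_info()
        if info.state in ('queued', 'running'):
            print(f'progress is {info.progress}%')  # cf. the "progress is 0%" lines
            time.sleep(poll_interval)
        elif info.state == 'success':
            return info
        else:
            raise TaskFailed(info.error)  # cf. translate_fault(task_info.error)

In the real driver this loop is driven from an eventlet looping call (visible in the tracebacks as loopingcall.py line 75 invoking _poll_task), but the control flow is the same.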
[ 1155.954644] env[66583]: DEBUG nova.compute.claims [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1155.954829] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1155.955055] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1155.973433] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1155.979911] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1155.980583] env[66583]: DEBUG nova.compute.utils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance 68449c86-cda6-46ff-a349-c2072829257e could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1155.982153] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1155.982324] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1155.982492] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1155.982659] env[66583]: DEBUG nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1155.982828] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1156.009015] env[66583]: DEBUG nova.network.neutron [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1156.018652] env[66583]: INFO nova.compute.manager [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Took 0.04 seconds to deallocate network for instance. [ 1156.021814] env[66583]: DEBUG oslo_vmware.rw_handles [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1156.080583] env[66583]: DEBUG oslo_vmware.rw_handles [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Completed reading data from the image iterator. 
{{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1156.080769] env[66583]: DEBUG oslo_vmware.rw_handles [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1156.096843] env[66583]: DEBUG oslo_concurrency.lockutils [None req-8434f75e-71d0-4004-96c9-23e991faa524 tempest-InstanceActionsNegativeTestJSON-1992528863 tempest-InstanceActionsNegativeTestJSON-1992528863-project-member] Lock "68449c86-cda6-46ff-a349-c2072829257e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.417s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1160.847713] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1160.848060] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Cleaning up deleted instances {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11101}} [ 1160.885008] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] There are 15 instances to clean {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11110}} [ 1160.885429] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1160.921122] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1160.942530] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1160.964078] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1160.987278] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.011077] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 
1161.058898] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 68449c86-cda6-46ff-a349-c2072829257e] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.080119] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 87acbe03-624d-454c-b108-0566ca0d750e] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.104560] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: a14582eb-f78f-44d6-8c82-16976c0cec5b] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.126482] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 63244459-f37b-4fdb-8afc-9e4a80156099] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.151145] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 83ac0082-b7fe-408d-9d5a-6e614ae7e61a] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.171206] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 9915557d-4251-44a2-bf59-3dd542dfb527] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.192047] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 6deed686-ceca-45a1-b8e4-2461b2e3f039] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.213171] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 12bc9e29-ecea-40e9-af34-a067f3d2301f] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.234921] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: 4fde404c-9011-4e1a-8b3c-8c89e5e45c00] Instance has had 0 of 5 cleanup attempts {{(pid=66583) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11114}} [ 1161.846308] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1163.853580] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1165.847831] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1165.848129] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None 
None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1165.848169] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1165.857917] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1165.858083] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1167.847134] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1167.847134] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1167.847134] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1167.847134] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Cleaning up deleted instances with incomplete migration {{(pid=66583) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11139}} [ 1168.851122] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1168.862651] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1168.862849] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1168.862996] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1169.846841] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1170.842950] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1170.845522] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1170.854418] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1170.854628] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1170.854809] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1170.854962] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1170.856096] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb327d0a-6637-44e9-bfbf-a90c5cda8a86 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.864728] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6259cf4e-d0e5-4ac4-8cca-fe01e6b2d61a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.878978] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f799899-950a-4b23-972a-03f19ccbaa1b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.884875] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae7f6763-e161-47d0-87d4-dce7d75b13c7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.913176] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
free_ram=180934MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1170.913338] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1170.913534] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1171.032108] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance c29638e8-98fd-4de7-8628-932b19087ecd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1171.032346] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1171.032527] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1171.047242] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing inventories for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1171.059313] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Updating ProviderTree inventory for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1171.059504] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Updating inventory in ProviderTree for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 
1171.068947] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing aggregate associations for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc, aggregates: None {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1171.083938] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Refreshing trait associations for resource provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc, traits: COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_VMDK {{(pid=66583) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1171.107647] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62d4317e-36e8-4928-81bb-b4ece4637a26 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.114797] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-336e964f-1471-48b4-848c-d2d81c88b808 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.143351] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e635b3a2-bacf-4868-b484-2dd13d2f82de {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.150068] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c22f24d3-c5e4-4190-a7de-3d78d3349f31 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.163708] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1171.172087] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1171.186168] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1171.186351] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.273s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.452751] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task 
ComputeManager._sync_power_states {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.463176] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Getting list of instances from cluster (obj){ [ 1192.463176] env[66583]: value = "domain-c8" [ 1192.463176] env[66583]: _type = "ClusterComputeResource" [ 1192.463176] env[66583]: } {{(pid=66583) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1192.464198] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edc74b00-c912-475b-b94f-9973b268eb20 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.478668] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Got total of 7 instances {{(pid=66583) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1192.478843] env[66583]: WARNING nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] While synchronizing instance power states, found 1 instances in the database and 7 instances on the hypervisor. [ 1192.479007] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Triggering sync for uuid c29638e8-98fd-4de7-8628-932b19087ecd {{(pid=66583) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10224}} [ 1192.479335] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "c29638e8-98fd-4de7-8628-932b19087ecd" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1202.319489] env[66583]: WARNING oslo_vmware.rw_handles [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1202.319489] env[66583]: ERROR oslo_vmware.rw_handles [ 1202.320504] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 
tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1202.321896] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1202.322153] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Copying Virtual Disk [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/5351f522-a6ed-4160-8c03-d65ec120f07a/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1202.322443] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-de456498-56aa-4550-ab3b-ce488a0f57a2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.330921] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1202.330921] env[66583]: value = "task-3470389" [ 1202.330921] env[66583]: _type = "Task" [ 1202.330921] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1202.338745] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470389, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1202.842062] env[66583]: DEBUG oslo_vmware.exceptions [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Fault InvalidArgument not matched. 
{{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1202.842330] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1202.842889] env[66583]: ERROR nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1202.842889] env[66583]: Faults: ['InvalidArgument'] [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Traceback (most recent call last): [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] yield resources [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] self.driver.spawn(context, instance, image_meta, [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] self._fetch_image_if_missing(context, vi) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] image_cache(vi, tmp_image_ds_loc) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] vm_util.copy_virtual_disk( [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] session._wait_for_task(vmdk_copy_task) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] return self.wait_for_task(task_ref) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] return evt.wait() [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] result = hub.switch() [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] return self.greenlet.switch() [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] self.f(*self.args, **self.kw) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] raise exceptions.translate_fault(task_info.error) [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Faults: ['InvalidArgument'] [ 1202.842889] env[66583]: ERROR nova.compute.manager [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] [ 1202.844019] env[66583]: INFO nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Terminating instance [ 1202.844808] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1202.845024] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1202.845677] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Start 
destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1202.845874] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1202.846097] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b21c640-6a88-4546-8218-05ebf7a7d04a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.848292] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0118424e-63ba-4518-9cfe-22a2132787c4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.855270] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1202.856117] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-39766d8d-b4e5-41cf-8a44-a4c179806b0f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.857430] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1202.857601] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1202.858299] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f0f770dd-fea6-4687-b3bc-959e1191b348 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.863344] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Waiting for the task: (returnval){ [ 1202.863344] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]52a42881-2122-7fbd-393e-b3cdc4db7cb7" [ 1202.863344] env[66583]: _type = "Task" [ 1202.863344] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1202.874551] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]52a42881-2122-7fbd-393e-b3cdc4db7cb7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1202.928164] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1202.928360] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1202.928505] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleting the datastore file [datastore1] 89ccce06-2094-4f87-a77b-cad92d351dfa {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1202.928815] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c16820af-b235-44ff-a273-7067845d27fc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1202.934951] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1202.934951] env[66583]: value = "task-3470391" [ 1202.934951] env[66583]: _type = "Task" [ 1202.934951] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1202.942965] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470391, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1203.374464] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1203.374801] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Creating directory with path [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1203.374939] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-72ea098d-ec1c-4c65-bc6a-3ef6ce583769 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.385374] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Created directory with path [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1203.385564] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Fetch image to [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1203.385717] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1203.386434] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-933ee7e7-6156-4f1f-b092-7c02db1a3fdc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.392702] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81c9a3e6-073a-4972-b3ba-6e19d707b36a {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.401290] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-692a8d1a-ed36-45f9-86eb-87357d1b42ad {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.431088] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d91db84e-6876-4357-b26f-847319d7979a {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.438984] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bb402f80-da25-4d5e-a2eb-8510a9d15d7c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1203.444831] env[66583]: DEBUG oslo_vmware.api [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470391, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063028} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1203.445065] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1203.445252] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1203.445423] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1203.445598] env[66583]: INFO nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Took 0.60 seconds to destroy the instance on the hypervisor. 
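
Editor's note: the DeleteDatastoreFile_Task above is driven by oslo.vmware's wait_for_task, which the log shows polling the task ("progress is 0%" at _poll_task) until it reaches a terminal state and then recording duration_secs. A minimal, self-contained sketch of that poll-until-terminal-state loop follows; it assumes a vCenter-style TaskInfo object with state/progress/error fields, and poll_task_until_done, TaskFailed and the fake task are illustrative names, not oslo.vmware's actual API.

    import time

    class TaskFailed(Exception):
        """Raised when the polled task reports an error state (sketch of a vim fault)."""

    def poll_task_until_done(get_task_info, interval=0.5, timeout=60.0):
        """Poll a vCenter-style task until it reaches 'success' or 'error'.

        get_task_info is any callable returning an object with .state
        ('queued' | 'running' | 'success' | 'error'), .progress and .error,
        mirroring the TaskInfo shape that oslo.vmware polls in _poll_task.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = get_task_info()
            if info.state == "success":
                return info                   # caller can read e.g. a duration from it
            if info.state == "error":
                raise TaskFailed(info.error)  # upstream this is translated to a fault class
            # queued / running: report progress and try again, like "progress is 0%"
            print(f"progress is {info.progress}%")
            time.sleep(interval)
        raise TimeoutError("task did not complete within the timeout")

    # Tiny usage example with a fake task that succeeds on the second poll.
    class _FakeInfo:
        def __init__(self, state, progress=0, error=None):
            self.state, self.progress, self.error = state, progress, error

    _states = iter([_FakeInfo("running"), _FakeInfo("success", 100)])
    poll_task_until_done(lambda: next(_states), interval=0.01)
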
[ 1203.447713] env[66583]: DEBUG nova.compute.claims [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1203.447845] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1203.448104] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1203.459973] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1203.472729] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1203.473537] env[66583]: DEBUG nova.compute.utils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance 89ccce06-2094-4f87-a77b-cad92d351dfa could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1203.476509] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1203.476679] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1203.476841] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1203.477015] env[66583]: DEBUG nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1203.477186] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1203.501907] env[66583]: DEBUG nova.network.neutron [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Updating instance_info_cache with network_info: [] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1203.508695] env[66583]: DEBUG oslo_vmware.rw_handles [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1203.562728] env[66583]: INFO nova.compute.manager [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 89ccce06-2094-4f87-a77b-cad92d351dfa] Took 0.09 seconds to deallocate network for instance. [ 1203.567321] env[66583]: DEBUG oslo_vmware.rw_handles [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1203.567321] env[66583]: DEBUG oslo_vmware.rw_handles [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1203.606768] env[66583]: DEBUG oslo_concurrency.lockutils [None req-f34335a9-7a8d-4236-ba63-69b4a9894eb3 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "89ccce06-2094-4f87-a77b-cad92d351dfa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 271.284s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1225.873014] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1226.846168] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1226.846365] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Starting heal instance info cache {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9814}} [ 1226.846427] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Rebuilding the list of instances to heal {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9818}} [ 1226.856651] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Skipping network cache update for instance because it is Building. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9827}} [ 1226.856787] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Didn't find any instances for network info cache update. {{(pid=66583) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9900}} [ 1227.846598] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.846732] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.847170] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1228.847170] env[66583]: DEBUG nova.compute.manager [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=66583) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10433}} [ 1229.847179] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.846608] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.846851] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager.update_available_resource {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1230.856705] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1230.857037] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1230.857087] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1230.857250] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=66583) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1230.858396] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb59ea8c-633d-46b0-a79a-a47bb6cf7074 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1230.867239] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e8b04f-30b3-4d9e-b006-107c3a33114d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1230.880771] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c6aa3d1-20f4-49ef-a2cf-39c1ab127921 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1230.887238] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf057563-82dc-4839-9f73-2ed60f46c351 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1230.915739] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180952MB 
free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=66583) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1230.915880] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1230.916087] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1230.953105] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Instance c29638e8-98fd-4de7-8628-932b19087ecd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=66583) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1230.953324] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1230.953478] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=66583) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1230.978449] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1fa030d-7728-4dfb-855c-2e3dd109209f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1230.985450] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2a4d67f-b9d5-44ce-8c14-58fcc7b57541 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.016217] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73a3399b-3cbe-4eb6-adfc-7e7c70411dd8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.022894] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aba3802a-1290-4e58-a28d-a4eeaaf1d991 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1231.037684] env[66583]: DEBUG nova.compute.provider_tree [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed in ProviderTree for provider: 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc {{(pid=66583) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1231.045364] env[66583]: DEBUG nova.scheduler.client.report [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Inventory has not changed for provider 19ca8ba5-bd08-4664-b5ea-7bb8423a24bc based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=66583) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1231.059317] env[66583]: DEBUG nova.compute.resource_tracker [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=66583) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1231.059527] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1232.054504] env[66583]: DEBUG oslo_service.periodic_task [None req-7455e8c0-d0a7-46a8-bfd0-488093645bf3 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=66583) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1249.797849] env[66583]: WARNING oslo_vmware.rw_handles [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles response.begin() [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1249.797849] env[66583]: ERROR oslo_vmware.rw_handles [ 1249.798610] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Downloaded image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1249.800383] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 
035e8729-c02f-490e-a0e4-b8877b52e75b] Caching image {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1249.800701] env[66583]: DEBUG nova.virt.vmwareapi.vm_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Copying Virtual Disk [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk to [datastore1] vmware_temp/72ec50c1-479f-492a-bc59-7ddd1bc9dfd9/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk {{(pid=66583) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1249.801017] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ceaa4151-1830-4556-b5a9-888aab08796f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1249.808396] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Waiting for the task: (returnval){ [ 1249.808396] env[66583]: value = "task-3470392" [ 1249.808396] env[66583]: _type = "Task" [ 1249.808396] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1249.816352] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Task: {'id': task-3470392, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1250.319070] env[66583]: DEBUG oslo_vmware.exceptions [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Fault InvalidArgument not matched. 
{{(pid=66583) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1250.319283] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1250.319830] env[66583]: ERROR nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1250.319830] env[66583]: Faults: ['InvalidArgument'] [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] yield resources [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.driver.spawn(context, instance, image_meta, [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._fetch_image_if_missing(context, vi) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] image_cache(vi, tmp_image_ds_loc) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] vm_util.copy_virtual_disk( [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] session._wait_for_task(vmdk_copy_task) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.wait_for_task(task_ref) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return evt.wait() [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] result = hub.switch() [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.greenlet.switch() [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.f(*self.args, **self.kw) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise exceptions.translate_fault(task_info.error) [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Faults: ['InvalidArgument'] [ 1250.319830] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1250.320891] env[66583]: INFO nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Terminating instance [ 1250.321731] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1250.321955] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1250.322192] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-70177694-fa76-481b-a7c7-28f73f0428c0 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.324271] env[66583]: 
DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1250.324475] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1250.325171] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54191831-2501-4cbe-975e-2cdce0d1944f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.331527] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1250.331727] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cdb0e612-88e0-4f92-96e8-d180423f139c {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.333802] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1250.333978] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1250.334890] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-47409b2e-09a5-4ef3-ade3-f7d26bff7507 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.339182] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 1250.339182] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]520623d0-5f99-a9e0-cdc4-c38a6340b124" [ 1250.339182] env[66583]: _type = "Task" [ 1250.339182] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1250.345818] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]520623d0-5f99-a9e0-cdc4-c38a6340b124, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1250.396573] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1250.396797] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1250.397079] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Deleting the datastore file [datastore1] 035e8729-c02f-490e-a0e4-b8877b52e75b {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1250.397371] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3c4e6028-ce0e-4b92-a611-ca8f35a35528 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.403443] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Waiting for the task: (returnval){ [ 1250.403443] env[66583]: value = "task-3470394" [ 1250.403443] env[66583]: _type = "Task" [ 1250.403443] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1250.412368] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Task: {'id': task-3470394, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1250.850010] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1250.850429] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Creating directory with path [datastore1] vmware_temp/79edd8f7-01c4-4697-a6da-b7eb59de8c3c/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1250.850512] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4dfcaabb-b56d-480d-ab75-f4fe4116216d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.861576] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Created directory with path [datastore1] vmware_temp/79edd8f7-01c4-4697-a6da-b7eb59de8c3c/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1250.861808] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Fetch image to [datastore1] vmware_temp/79edd8f7-01c4-4697-a6da-b7eb59de8c3c/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1250.861935] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/79edd8f7-01c4-4697-a6da-b7eb59de8c3c/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1250.862647] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b285d211-d220-4d33-9307-0fec4ad3fd39 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.868967] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6932ada1-8370-4363-ade1-3dee96caff59 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.877790] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a839442a-2958-4643-a36b-f6129e2ded42 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.910108] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb44bd9-d93c-4bc7-874b-401c89587bdb {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.916948] env[66583]: DEBUG oslo_vmware.api [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Task: {'id': task-3470394, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073339} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1250.918291] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1250.918486] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1250.918686] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1250.918861] env[66583]: INFO nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Took 0.59 seconds to destroy the instance on the hypervisor. 
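
Editor's note: the cleanup that follows mirrors the earlier one: the claim is aborted under the "compute_resources" lock, the network is deallocated, and the spawn failure that triggered it is reported as nova.exception.ImageNotAuthorized. Per the traceback below (glance.py:1031, _reraise_translated_image_exception), nova converts the glanceclient HTTPUnauthorized into its own ImageNotAuthorized while keeping the original traceback via raise new_exc.with_traceback(exc_trace), which is why two chained tracebacks appear joined by "During handling of the above exception, another exception occurred". A minimal runnable sketch of that translate-and-reraise pattern follows; the exception classes here are stand-ins, not the real nova or glanceclient types.

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""

    def translate_image_exception(image_id, exc):
        """Map a client-level error to a service-level one (sketch)."""
        if isinstance(exc, HTTPUnauthorized):
            return ImageNotAuthorized(f"Not authorized for image {image_id}.")
        return exc

    def show(image_id):
        try:
            # Stand-in for the glance client call that returned HTTP 401.
            raise HTTPUnauthorized("HTTP 401 Unauthorized")
        except HTTPUnauthorized as exc:
            new_exc = translate_image_exception(image_id, exc)
            # Re-raise the translated exception but keep the original traceback;
            # raising inside the except block also chains the original error,
            # producing the two joined tracebacks seen in the log.
            raise new_exc.with_traceback(exc.__traceback__)

    try:
        show("2a5e619b-8532-4b3c-9d86-85994a7987af")
    except ImageNotAuthorized as err:
        print(err)
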
[ 1250.920590] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-03b551ec-83cc-4bd1-9a33-231c405e113d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1250.922452] env[66583]: DEBUG nova.compute.claims [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1250.922623] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1250.922836] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1250.945107] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1250.948412] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1250.948775] env[66583]: DEBUG nova.compute.utils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance 035e8729-c02f-490e-a0e4-b8877b52e75b could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1250.950433] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1250.950604] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1250.950771] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1250.950938] env[66583]: DEBUG nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1250.951119] env[66583]: DEBUG nova.network.neutron [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1251.006946] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1251.007852] env[66583]: ERROR nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last): [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] result = getattr(controller, method)(*args, **kwargs) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._get(image_id) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] resp, body = self.http_client.get(url, headers=header) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.request(url, 'GET', **kwargs) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._handle_response(resp) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exc.from_response(resp, resp.content) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] During handling of the above exception, another exception occurred: [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last): [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] yield resources [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.driver.spawn(context, instance, image_meta, [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._fetch_image_if_missing(context, vi) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image_fetch(context, vi, tmp_image_ds_loc) [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] images.fetch_image( [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1251.007852] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] metadata = IMAGE_API.get(context, image_ref) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return session.show(context, image_id, [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] _reraise_translated_image_exception(image_id) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise new_exc.with_traceback(exc_trace) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] result = getattr(controller, method)(*args, **kwargs) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._get(image_id) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] resp, body = self.http_client.get(url, headers=header) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.request(url, 'GET', **kwargs) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._handle_response(resp) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exc.from_response(resp, resp.content) [ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1251.009142] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] [ 1251.009142] env[66583]: INFO nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Terminating instance [ 1251.010270] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1251.010270] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1251.010746] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1251.010935] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1251.012611] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e56ac7db-9389-41f4-9993-f934490f0021 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.014912] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e3ff1ed-8ced-4272-bf53-b87465594fec {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.022443] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1251.022653] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ac9b764e-34db-4b0b-8b7b-f0c9a2187e04 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.024702] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1251.024868] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 
tempest-ServersTestManualDisk-1051143333-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1251.025784] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c3aece3b-ae4d-47c2-99fe-63784c172253 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.030356] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Waiting for the task: (returnval){ [ 1251.030356] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]526726f3-e099-eb29-2f8f-c171dfc1a13c" [ 1251.030356] env[66583]: _type = "Task" [ 1251.030356] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1251.037591] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]526726f3-e099-eb29-2f8f-c171dfc1a13c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1251.060551] env[66583]: DEBUG neutronclient.v2_0.client [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1251.061965] env[66583]: ERROR nova.compute.manager [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
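[editor's note] Two workers are interleaved here: the ServersTestManualDisk request is blocking on a SearchDatastore_Task via oslo_vmware's wait_for_task, while the ServerPasswordTestJSON request has just hit a Keystone 401 tearing down networks (its full traceback follows below). The task wait is a plain poll loop: the task state is read periodically and the call either returns on success or translates the vSphere fault into a Python exception, which is where the VimFaultException later in this log originates. A rough, self-contained sketch, assuming a hypothetical session.get_task_state(task_ref) helper and using time.sleep() where the real code drives the poll from an eventlet looping call:

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(session, task_ref, poll_interval=0.5, timeout=300.0):
        # Poll until the task leaves the queued/running states; the log's
        # "progress is 0%" lines are emitted from the equivalent of this
        # loop body.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            state, error = session.get_task_state(task_ref)  # hypothetical
            if state == 'success':
                return
            if state == 'error':
                # Real code: raise exceptions.translate_fault(task_info.error)
                raise TaskFailed(error)
            time.sleep(poll_interval)
        raise TaskFailed('timed out waiting for task %s' % task_ref)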
[ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.driver.spawn(context, instance, image_meta, [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._fetch_image_if_missing(context, vi) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] image_cache(vi, tmp_image_ds_loc) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] vm_util.copy_virtual_disk( [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] session._wait_for_task(vmdk_copy_task) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.wait_for_task(task_ref) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return evt.wait() [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] result = hub.switch() [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.greenlet.switch() [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.f(*self.args, **self.kw) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise exceptions.translate_fault(task_info.error) [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Faults: ['InvalidArgument'] [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] During handling of the above exception, another exception occurred: [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._build_and_run_instance(context, instance, image, [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] with excutils.save_and_reraise_exception(): [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.force_reraise() [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1251.061965] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise self.value [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] with self.rt.instance_claim(context, instance, node, allocs, [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.abort() [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 
1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return f(*args, **kwargs) [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._unset_instance_host_and_node(instance) [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] instance.save() [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] updates, result = self.indirection_api.object_action( [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return cctxt.call(context, 'object_action', objinst=objinst, [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] result = self.transport._send( [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self._driver.send(target, ctxt, message, [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise result [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] nova.exception_Remote.InstanceNotFound_Remote: Instance 035e8729-c02f-490e-a0e4-b8877b52e75b could not be found. 
[ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return getattr(target, method)(*args, **kwargs) [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return fn(self, *args, **kwargs) [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] old_ref, inst_ref = db.instance_update_and_get_original( [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return f(*args, **kwargs) [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1251.063093] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] with excutils.save_and_reraise_exception() as ectxt: [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.force_reraise() [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise self.value [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return f(*args, 
**kwargs) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return f(context, *args, **kwargs) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise exception.InstanceNotFound(instance_id=uuid) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] nova.exception.InstanceNotFound: Instance 035e8729-c02f-490e-a0e4-b8877b52e75b could not be found. [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] During handling of the above exception, another exception occurred: [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] exception_handler_v20(status_code, error_body) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise client_exc(message=error_message, [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1251.065409] 
env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Neutron server returns request_ids: ['req-28b3576a-b7ec-4b67-aa77-b74de4973375'] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] During handling of the above exception, another exception occurred: [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] Traceback (most recent call last): [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._deallocate_network(context, instance, requested_networks) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self.network_api.deallocate_for_instance( [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] data = neutron.list_ports(**search_opts) [ 1251.065409] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.list('ports', self.ports_path, retrieve_all, [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] for r in self._pagination(collection, path, **params): [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] res = self.get(path, params=params) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.retry_request("GET", action, body=body, [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] return self.do_request(method, action, body=body, [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] ret = obj(*args, **kwargs) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] self._handle_fault_response(status_code, replybody, resp) [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] raise exception.Unauthorized() [ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] nova.exception.Unauthorized: Not authorized. 
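[editor's note] The final frames above are the interesting part: nova/network/neutron.py routes every neutronclient call through a wrapper (the repeated "ret = obj(*args, **kwargs)" frames at line 196) that converts the client's Unauthorized into nova.exception.Unauthorized at line 204, so compute-manager code never handles neutronclient exception types directly. A sketch of that proxy as a decorator factory; the names below are illustrative, not Nova's:

    import functools

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized ('Not authorized.')."""

    def translate_401(client_exc_types):
        # Decorator factory: wrap a client call so that a client-level 401
        # surfaces as the service's own exception type.
        def decorator(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                try:
                    return func(*args, **kwargs)
                except client_exc_types:
                    raise Unauthorized()
            return wrapper
        return decorator

    # usage sketch:
    #   list_ports = translate_401((NeutronUnauthorized,))(client.list_ports)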
[ 1251.066962] env[66583]: ERROR nova.compute.manager [instance: 035e8729-c02f-490e-a0e4-b8877b52e75b] [ 1251.085961] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1251.086217] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1251.086410] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Deleting the datastore file [datastore1] 08689558-cc57-43c5-b56e-f9785b515717 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1251.087441] env[66583]: DEBUG oslo_concurrency.lockutils [None req-87d64194-4d1e-497a-a329-53bf5918eead tempest-ServerPasswordTestJSON-2079852061 tempest-ServerPasswordTestJSON-2079852061-project-member] Lock "035e8729-c02f-490e-a0e4-b8877b52e75b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 320.824s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1251.087635] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3a07231e-8998-42ac-b94b-b642fe27a05b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.094585] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Waiting for the task: (returnval){ [ 1251.094585] env[66583]: value = "task-3470396" [ 1251.094585] env[66583]: _type = "Task" [ 1251.094585] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1251.103955] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': task-3470396, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1251.540151] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1251.540443] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Creating directory with path [datastore1] vmware_temp/251e7c85-154c-4171-864b-8c9bf66e56cd/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1251.540614] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-85777189-eeba-4dbc-be7a-038031676b6d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.551121] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Created directory with path [datastore1] vmware_temp/251e7c85-154c-4171-864b-8c9bf66e56cd/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1251.551316] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Fetch image to [datastore1] vmware_temp/251e7c85-154c-4171-864b-8c9bf66e56cd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1251.551469] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/251e7c85-154c-4171-864b-8c9bf66e56cd/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1251.552179] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-657019e3-4103-4b15-a991-af0802524fb4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.558324] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4067246-2cd4-4dcc-bf03-d1b99b304515 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.566783] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3381d33c-0cc7-4930-a1f8-806c81555d21 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.599929] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7e647b5-8ea7-4257-9d77-1c64c35a7d9f {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.606580] env[66583]: DEBUG oslo_vmware.api [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Task: {'id': task-3470396, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074483} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1251.607985] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1251.608191] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1251.608366] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1251.608582] env[66583]: INFO nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Took 0.60 seconds to destroy the instance on the hypervisor. 
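[editor's note] The 08689558 teardown above is the whole vmwareapi destroy path in four records: UnregisterVM, FileManager.DeleteDatastoreFile_Task, a task wait (task-3470396, completed in 0.074s), and the "Instance destroyed" confirmation. As a linear sketch against a hypothetical session wrapper (the call names mirror the vSphere operations in the log, not a specific client API):

    def destroy_instance(session, vm_ref, ds_path):
        # 1. Remove the VM from the vCenter inventory ("Unregistering the VM").
        session.invoke('UnregisterVM', vm_ref)            # hypothetical wrapper
        # 2. Delete its directory from the datastore; this returns a Task
        #    ("Deleting the datastore file [datastore1] 08689558-...").
        task_ref = session.invoke('DeleteDatastoreFile_Task', ds_path)
        # 3. Block until the deletion completes (same poll loop sketched above).
        wait_for_task(session, task_ref)
        # 4. Only now is the instance reported destroyed on the hypervisor.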
[ 1251.610640] env[66583]: DEBUG nova.compute.claims [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1251.610813] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1251.611034] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1251.614133] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8621908f-504f-41e8-8946-6c0e54ca4e11 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1251.633269] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1251.637040] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1251.637154] env[66583]: DEBUG nova.compute.utils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance 08689558-cc57-43c5-b56e-f9785b515717 could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1251.638829] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1251.638997] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1251.639173] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1251.639343] env[66583]: DEBUG nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1251.639508] env[66583]: DEBUG nova.network.neutron [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1251.665572] env[66583]: DEBUG neutronclient.v2_0.client [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1251.667139] env[66583]: ERROR nova.compute.manager [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] [instance: 08689558-cc57-43c5-b56e-f9785b515717] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
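[editor's note] The traceback that follows (like the earlier one for 035e8729) leans on oslo_utils.excutils.save_and_reraise_exception: the "with" block lets cleanup such as abort_instance_claim() run after the spawn failure, then force_reraise() restores the original exception; when the cleanup itself fails, its exception propagates instead, which is how InstanceNotFound ends up chained onto the original image fault here. A compact sketch of that context manager's behavior (simplified: the real one also logs the saved exception):

    import sys

    class save_and_reraise_exception:
        # Use inside an 'except' block: cleanup runs in the 'with' body, and
        # on clean exit the *original* exception is re-raised. A new exception
        # raised by the cleanup itself propagates instead (as in this log,
        # where InstanceNotFound escapes from abort_instance_claim()).
        def __enter__(self):
            self.type_, self.value, self.tb = sys.exc_info()
            self.reraise = True
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            if exc_type is None and self.reraise:
                self.force_reraise()
            return False  # never swallow an exception from the cleanup body

        def force_reraise(self):
            raise self.value.with_traceback(self.tb)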
[ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last): [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] result = getattr(controller, method)(*args, **kwargs) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._get(image_id) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] resp, body = self.http_client.get(url, headers=header) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.request(url, 'GET', **kwargs) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._handle_response(resp) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exc.from_response(resp, resp.content) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] During handling of the above exception, another exception occurred: [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last): [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.driver.spawn(context, instance, image_meta, [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._fetch_image_if_missing(context, vi) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image_fetch(context, vi, tmp_image_ds_loc) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] images.fetch_image( [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] metadata = IMAGE_API.get(context, image_ref) [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return session.show(context, image_id, [ 1251.667139] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] _reraise_translated_image_exception(image_id) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise new_exc.with_traceback(exc_trace) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 
08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] result = getattr(controller, method)(*args, **kwargs) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._get(image_id) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] resp, body = self.http_client.get(url, headers=header) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.request(url, 'GET', **kwargs) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._handle_response(resp) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exc.from_response(resp, resp.content) [ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] During handling of the above exception, another exception occurred:
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last):
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._build_and_run_instance(context, instance, image,
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] with excutils.save_and_reraise_exception():
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.force_reraise()
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise self.value
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] with self.rt.instance_claim(context, instance, node, allocs,
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.abort()
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1251.668560] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return f(*args, **kwargs)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._unset_instance_host_and_node(instance)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] instance.save()
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] updates, result = self.indirection_api.object_action(
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] result = self.transport._send(
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._driver.send(target, ctxt, message,
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise result
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] nova.exception_Remote.InstanceNotFound_Remote: Instance 08689558-cc57-43c5-b56e-f9785b515717 could not be found.
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last):
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return getattr(target, method)(*args, **kwargs)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return fn(self, *args, **kwargs)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return f(*args, **kwargs)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] with excutils.save_and_reraise_exception() as ectxt:
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.force_reraise()
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise self.value
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return f(*args, **kwargs)
[ 1251.669719] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return f(context, *args, **kwargs)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exception.InstanceNotFound(instance_id=uuid)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] nova.exception.InstanceNotFound: Instance 08689558-cc57-43c5-b56e-f9785b515717 could not be found.
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] During handling of the above exception, another exception occurred:
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last):
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] exception_handler_v20(status_code, error_body)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise client_exc(message=error_message,
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Neutron server returns request_ids: ['req-0a93bb98-7394-483e-a486-1644a753d13a']
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] During handling of the above exception, another exception occurred:
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] Traceback (most recent call last):
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._deallocate_network(context, instance, requested_networks)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self.network_api.deallocate_for_instance(
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] data = neutron.list_ports(**search_opts)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.list('ports', self.ports_path, retrieve_all,
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1251.671404] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] for r in self._pagination(collection, path, **params):
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] res = self.get(path, params=params)
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.retry_request("GET", action, body=body,
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] return self.do_request(method, action, body=body,
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] ret = obj(*args, **kwargs)
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] self._handle_fault_response(status_code, replybody, resp)
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] raise exception.Unauthorized()
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717] nova.exception.Unauthorized: Not authorized.
[ 1251.672936] env[66583]: ERROR nova.compute.manager [instance: 08689558-cc57-43c5-b56e-f9785b515717]
[ 1251.688259] env[66583]: DEBUG oslo_concurrency.lockutils [None req-b74b1d4c-5222-40a9-9337-a72476fd4c42 tempest-SecurityGroupsTestJSON-1014683064 tempest-SecurityGroupsTestJSON-1014683064-project-member] Lock "08689558-cc57-43c5-b56e-f9785b515717" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 294.052s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1251.730653] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1251.731791] env[66583]: ERROR nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] result = getattr(controller, method)(*args, **kwargs)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._get(image_id)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] resp, body = self.http_client.get(url, headers=header)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.request(url, 'GET', **kwargs)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._handle_response(resp)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exc.from_response(resp, resp.content)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] During handling of the above exception, another exception occurred:
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] yield resources
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.driver.spawn(context, instance, image_meta,
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._fetch_image_if_missing(context, vi)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image_fetch(context, vi, tmp_image_ds_loc)
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] images.fetch_image(
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1251.731791] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] metadata = IMAGE_API.get(context, image_ref)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return session.show(context, image_id,
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] _reraise_translated_image_exception(image_id)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise new_exc.with_traceback(exc_trace)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] result = getattr(controller, method)(*args, **kwargs)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._get(image_id)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] resp, body = self.http_client.get(url, headers=header)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.request(url, 'GET', **kwargs)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._handle_response(resp)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exc.from_response(resp, resp.content)
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1251.732811] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1251.732811] env[66583]: INFO nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Terminating instance
[ 1251.734256] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1251.734531] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1251.734833] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-88cd27ae-eb60-4a98-90d4-cd73ec0de8a2 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.738965] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1251.739174] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1251.739946] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cd0fb3c-b693-42d6-9116-db2153ad0472 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.746980] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1251.747215] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3e107d3e-2c64-4ccd-b9db-07b4e58a4278 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.749413] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1251.749664] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1251.750831] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d1a9e507-b9ad-4117-8df7-b60a7b2b1ce4 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.757148] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){
[ 1251.757148] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]529503ce-28e7-8535-94be-5ac090a1ffc7"
[ 1251.757148] env[66583]: _type = "Task"
[ 1251.757148] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1251.770929] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1251.771224] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] vmware_temp/8aedef35-61f7-4ea6-97a0-67dcca6ad386/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1251.771513] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e84a9aa7-a302-4986-aebd-b4f88f422e67 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.790787] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] vmware_temp/8aedef35-61f7-4ea6-97a0-67dcca6ad386/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1251.791107] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Fetch image to [datastore1] vmware_temp/8aedef35-61f7-4ea6-97a0-67dcca6ad386/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1251.791107] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/8aedef35-61f7-4ea6-97a0-67dcca6ad386/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1251.791888] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33059b6c-d333-4108-81e5-07b0830af633 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.798950] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0de03a2-9465-461a-82e6-e0e8fbedebdd {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.801792] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1251.801985] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1251.802178] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Deleting the datastore file [datastore1] e9136963-e0fc-4344-880b-a21549f2cf23 {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1251.802682] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c6386485-2205-4606-bf18-0175eb835262 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.810348] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3384c28-4822-4176-8c9b-bea9439cc5c7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.813851] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Waiting for the task: (returnval){
[ 1251.813851] env[66583]: value = "task-3470398"
[ 1251.813851] env[66583]: _type = "Task"
[ 1251.813851] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1251.843019] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b1f872d-712e-4664-ab0d-9974c622bd51 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.848563] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Task: {'id': task-3470398, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1251.851886] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e51637ec-f72d-437e-b6ae-fbcb821ed739 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.873606] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1251.966183] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1251.966973] env[66583]: ERROR nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last):
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] result = getattr(controller, method)(*args, **kwargs)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._get(image_id)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] resp, body = self.http_client.get(url, headers=header)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.request(url, 'GET', **kwargs)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._handle_response(resp)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exc.from_response(resp, resp.content)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] During handling of the above exception, another exception occurred:
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last):
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] yield resources
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.driver.spawn(context, instance, image_meta,
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._fetch_image_if_missing(context, vi)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image_fetch(context, vi, tmp_image_ds_loc)
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] images.fetch_image(
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1251.966973] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] metadata = IMAGE_API.get(context, image_ref)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return session.show(context, image_id,
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] _reraise_translated_image_exception(image_id)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise new_exc.with_traceback(exc_trace)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] result = getattr(controller, method)(*args, **kwargs)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._get(image_id)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] resp, body = self.http_client.get(url, headers=header)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.request(url, 'GET', **kwargs)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._handle_response(resp)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exc.from_response(resp, resp.content)
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1251.968734] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1251.968734] env[66583]: INFO nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Terminating instance
[ 1251.969849] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Start destroying the instance on the hypervisor. {{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1251.969849] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1251.970144] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1251.970342] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1251.971138] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b777c07-a9df-4405-94dd-99d45b44f75b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.974145] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9e33fed6-7510-4120-b742-c33ef252c2ee {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.980152] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1251.980358] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e3d1f1c0-b8ab-46f0-8069-c61bfc12b452 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.982492] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1251.982662] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1251.983576] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-83ae8935-de79-45ba-8bce-3c559c320704 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1251.987988] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){
[ 1251.987988] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]523bcd58-cb8d-d4f1-fb6f-ea9dc7bf575a"
[ 1251.987988] env[66583]: _type = "Task"
[ 1251.987988] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1251.998210] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]523bcd58-cb8d-d4f1-fb6f-ea9dc7bf575a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1252.046724] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1252.046947] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1252.047094] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleting the datastore file [datastore1] e1873f82-8e24-460a-b5cb-36e3bf06abcb {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1252.047345] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b366cf78-3214-4255-b968-427fa9da7d90 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.053214] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){
[ 1252.053214] env[66583]: value = "task-3470400"
[ 1252.053214] env[66583]: _type = "Task"
[ 1252.053214] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1252.061519] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470400, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1252.323352] env[66583]: DEBUG oslo_vmware.api [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Task: {'id': task-3470398, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064923} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1252.323563] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1252.323722] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1252.323895] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1252.324080] env[66583]: INFO nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Took 0.58 seconds to destroy the instance on the hypervisor.
[ 1252.326345] env[66583]: DEBUG nova.compute.claims [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1252.326539] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.326762] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.350663] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1252.351338] env[66583]: DEBUG nova.compute.utils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance e9136963-e0fc-4344-880b-a21549f2cf23 could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1252.352962] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1252.353143] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1252.353315] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1252.353478] env[66583]: DEBUG nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1252.353639] env[66583]: DEBUG nova.network.neutron [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1252.443777] env[66583]: DEBUG neutronclient.v2_0.client [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1252.445455] env[66583]: ERROR nova.compute.manager [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] result = getattr(controller, method)(*args, **kwargs)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._get(image_id)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] resp, body = self.http_client.get(url, headers=header)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.request(url, 'GET', **kwargs)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._handle_response(resp)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exc.from_response(resp, resp.content)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
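The innermost frames above end with glanceclient turning the Glance API's 401 response into a typed client exception via raise exc.from_response(resp, resp.content). A minimal sketch of that status-code-to-exception dispatch pattern, in Python; every name below is an illustrative stand-in, not glanceclient's actual internals:

class HTTPException(Exception):
    """Base for HTTP-level client errors (stand-in)."""
    status_code = 500

class HTTPUnauthorized(HTTPException):
    status_code = 401

_CODE_MAP = {401: HTTPUnauthorized}

def from_response(response, body):
    # Choose an exception class from the response's status code, falling
    # back to the generic base class for anything unmapped.
    cls = _CODE_MAP.get(response.status_code, HTTPException)
    return cls(body)

def _handle_response(response):
    # Mirrors the "raise exc.from_response(resp, resp.content)" frame in
    # the traceback: any error status becomes a typed client exception,
    # so callers only ever see decoded bodies on success.
    if response.status_code >= 400:
        raise from_response(response, response.content)
    return response.content

The chain continues below: Nova catches this client-level exception and translates it.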
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] During handling of the above exception, another exception occurred:
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.driver.spawn(context, instance, image_meta,
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._fetch_image_if_missing(context, vi)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image_fetch(context, vi, tmp_image_ds_loc)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] images.fetch_image(
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] metadata = IMAGE_API.get(context, image_ref)
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return session.show(context, image_id,
[ 1252.445455] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] _reraise_translated_image_exception(image_id)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise new_exc.with_traceback(exc_trace)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] result = getattr(controller, method)(*args, **kwargs)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._get(image_id)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] resp, body = self.http_client.get(url, headers=header)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.request(url, 'GET', **kwargs)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._handle_response(resp)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exc.from_response(resp, resp.content)
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
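The hop from glanceclient.exc.HTTPUnauthorized to nova.exception.ImageNotAuthorized happens in the glance.py:287/1031 frames above: the client error is caught, mapped to a Nova-level exception, and re-raised onto the original traceback, which is why both exception types share the same frames in the log. A simplified sketch of that translation step, with a stand-in exception type and an assumed translation predicate:

import sys

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized."""

def _translate_image_exception(image_id, exc_value):
    # Map a client-level 401 onto the service-level exception type;
    # anything else passes through unchanged. The predicate here is an
    # illustrative assumption, not Nova's full mapping table.
    if getattr(exc_value, "status_code", None) == 401:
        return ImageNotAuthorized(f"Not authorized for image {image_id}.")
    return exc_value

def _reraise_translated_image_exception(image_id):
    """Re-raise the in-flight client exception as a Nova-level one."""
    exc_type, exc_value, exc_trace = sys.exc_info()
    new_exc = _translate_image_exception(image_id, exc_value)
    # Attaching the original traceback preserves the glanceclient frames
    # under the new exception type, exactly as seen above.
    raise new_exc.with_traceback(exc_trace)

The chain continues below: the failed spawn triggers claim cleanup, which fails in turn.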
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] During handling of the above exception, another exception occurred:
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._build_and_run_instance(context, instance, image,
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] with excutils.save_and_reraise_exception():
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.force_reraise()
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise self.value
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] with self.rt.instance_claim(context, instance, node, allocs,
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.abort()
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1252.447604] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return f(*args, **kwargs)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._unset_instance_host_and_node(instance)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] instance.save()
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] updates, result = self.indirection_api.object_action(
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] result = self.transport._send(
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._driver.send(target, ctxt, message,
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise result
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] nova.exception_Remote.InstanceNotFound_Remote: Instance e9136963-e0fc-4344-880b-a21549f2cf23 could not be found.
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return getattr(target, method)(*args, **kwargs)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return fn(self, *args, **kwargs)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return f(*args, **kwargs)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] with excutils.save_and_reraise_exception() as ectxt:
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.force_reraise()
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise self.value
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return f(*args, **kwargs)
[ 1252.449250] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return f(context, *args, **kwargs)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exception.InstanceNotFound(instance_id=uuid)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] nova.exception.InstanceNotFound: Instance e9136963-e0fc-4344-880b-a21549f2cf23 could not be found.
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] During handling of the above exception, another exception occurred:
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] exception_handler_v20(status_code, error_body)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise client_exc(message=error_message,
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Neutron server returns request_ids: ['req-6844ac0c-e009-47e9-8ded-8c04d3f0d575']
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] During handling of the above exception, another exception occurred:
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] Traceback (most recent call last):
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._deallocate_network(context, instance, requested_networks)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self.network_api.deallocate_for_instance(
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] data = neutron.list_ports(**search_opts)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.list('ports', self.ports_path, retrieve_all,
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1252.450458] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] for r in self._pagination(collection, path, **params):
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] res = self.get(path, params=params)
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.retry_request("GET", action, body=body,
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] return self.do_request(method, action, body=body,
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] ret = obj(*args, **kwargs)
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] self._handle_fault_response(status_code, replybody, resp)
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] raise exception.Unauthorized()
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23] nova.exception.Unauthorized: Not authorized.
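Both "During handling of the above exception" hops that route through oslo_utils/excutils.py:227 and 200 above use save_and_reraise_exception, a context manager entered from inside an except block so cleanup can run before the original error propagates. A simplified model of its behavior (the real oslo.utils implementation also logs the saved exception and supports suppressing the re-raise):

import sys

class save_and_reraise_exception:
    """Simplified model of oslo_utils.excutils.save_and_reraise_exception."""

    def __enter__(self):
        self.reraise = True
        # Capture the exception currently being handled so cleanup code
        # in the with-block can run before it propagates.
        self.value = sys.exc_info()[1]
        return self

    def force_reraise(self):
        raise self.value

    def __exit__(self, exc_type, exc_value, exc_tb):
        if exc_type is not None:
            # The cleanup itself raised; let that new exception propagate
            # (this is what produces the chained tracebacks in the log).
            return False
        if self.reraise:
            self.force_reraise()

In the chain above it appears twice: once in _build_and_run_instance (manager.py:2621), where the claim abort raised InstanceNotFound before the spawn failure could be re-raised, and once inside oslo.db's retry wrapper.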
[ 1252.452135] env[66583]: ERROR nova.compute.manager [instance: e9136963-e0fc-4344-880b-a21549f2cf23]
[ 1252.465963] env[66583]: DEBUG oslo_concurrency.lockutils [None req-4ef4cf0a-642d-4f14-bea0-b3ba875e5a86 tempest-ServersTestManualDisk-1051143333 tempest-ServersTestManualDisk-1051143333-project-member] Lock "e9136963-e0fc-4344-880b-a21549f2cf23" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 321.325s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1252.497682] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1252.497918] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Creating directory with path [datastore1] vmware_temp/3bd24a4e-b146-4113-a53e-94527b8d8410/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1252.498151] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0fd04186-b35c-44b9-82f1-beab12ecdc32 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.509184] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Created directory with path [datastore1] vmware_temp/3bd24a4e-b146-4113-a53e-94527b8d8410/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1252.509370] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Fetch image to [datastore1] vmware_temp/3bd24a4e-b146-4113-a53e-94527b8d8410/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1252.509554] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/3bd24a4e-b146-4113-a53e-94527b8d8410/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1252.510238] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05daab39-5a13-4704-b35a-a781770f94e9 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.516147] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62bd9c8b-3131-49c6-b1fd-404bce86372d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.524788] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8698fedb-67e0-4f12-a012-d2f15f874810 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.557092] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc1714b9-49ef-4fda-8e46-b06ae3ea87dc {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.563685] env[66583]: DEBUG oslo_vmware.api [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470400, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063113} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1252.565104] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1252.565299] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1252.565476] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1252.565648] env[66583]: INFO nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Took 0.60 seconds to destroy the instance on the hypervisor.
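task-3470400's lifecycle above shows the oslo.vmware polling pattern: wait_for_task logs the task handle, _poll_task reports progress while the task runs, and the final record carries duration_secs once vCenter marks it done. A minimal loop in the same spirit; the get_task_info callable, its attributes, and the interval are assumptions for illustration, not oslo.vmware's actual signature:

import time

def wait_for_task(get_task_info, interval=0.5):
    """Poll a vCenter-style task until it leaves the 'running' state.

    get_task_info is assumed to return an object with .state, .progress
    and .error attributes, mirroring what the _poll_task lines report.
    """
    start = time.monotonic()
    while True:
        info = get_task_info()
        if info.state == "running":
            # Matches the "progress is N%" DEBUG lines above.
            print(f"progress is {info.progress}%")
            time.sleep(interval)
            continue
        duration = time.monotonic() - start
        if info.state == "success":
            # Matches the "completed successfully" record with its
            # duration_secs field.
            print(f"completed successfully, duration_secs={duration:.6f}")
            return info
        raise RuntimeError(f"task failed after {duration:.2f}s: {info.error}")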
[ 1252.567453] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1925137f-7f0f-453b-9ed9-0956a26aee7d {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1252.569674] env[66583]: DEBUG nova.compute.claims [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1252.569842] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1252.570063] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1252.591886] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1252.595241] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1252.595854] env[66583]: DEBUG nova.compute.utils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1252.597521] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Instance disappeared during build. {{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1252.597692] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1252.597854] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1252.598030] env[66583]: DEBUG nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1252.598196] env[66583]: DEBUG nova.network.neutron [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1252.621409] env[66583]: DEBUG neutronclient.v2_0.client [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1252.622862] env[66583]: ERROR nova.compute.manager [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last):
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] result = getattr(controller, method)(*args, **kwargs)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._get(image_id)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] resp, body = self.http_client.get(url, headers=header)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.request(url, 'GET', **kwargs)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._handle_response(resp)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exc.from_response(resp, resp.content)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] During handling of the above exception, another exception occurred:
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last):
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.driver.spawn(context, instance, image_meta,
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._fetch_image_if_missing(context, vi)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image_fetch(context, vi, tmp_image_ds_loc)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] images.fetch_image(
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] metadata = IMAGE_API.get(context, image_ref)
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return session.show(context, image_id,
[ 1252.622862] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] _reraise_translated_image_exception(image_id)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise new_exc.with_traceback(exc_trace)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] result = getattr(controller, method)(*args, **kwargs)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._get(image_id)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] resp, body = self.http_client.get(url, headers=header)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.request(url, 'GET', **kwargs)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._handle_response(resp)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exc.from_response(resp, resp.content)
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af.
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] During handling of the above exception, another exception occurred:
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb]
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last):
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._build_and_run_instance(context, instance, image,
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] with excutils.save_and_reraise_exception():
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.force_reraise()
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise self.value
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] with self.rt.instance_claim(context, instance, node, allocs,
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.abort()
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1252.623921] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return f(*args, **kwargs)
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._unset_instance_host_and_node(instance)
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] instance.save()
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] updates, result = self.indirection_api.object_action(
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] result = self.transport._send(
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._driver.send(target, ctxt, message,
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise result
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] nova.exception_Remote.InstanceNotFound_Remote: Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb could not be found.
[ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last): [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return getattr(target, method)(*args, **kwargs) [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return fn(self, *args, **kwargs) [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] old_ref, inst_ref = db.instance_update_and_get_original( [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return f(*args, **kwargs) [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] with excutils.save_and_reraise_exception() as ectxt: [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.force_reraise() [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise self.value [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return f(*args, 
**kwargs) [ 1252.625055] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return f(context, *args, **kwargs) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exception.InstanceNotFound(instance_id=uuid) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] nova.exception.InstanceNotFound: Instance e1873f82-8e24-460a-b5cb-36e3bf06abcb could not be found. [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] During handling of the above exception, another exception occurred: [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last): [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] exception_handler_v20(status_code, error_body) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise client_exc(message=error_message, [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1252.626316] 
env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Neutron server returns request_ids: ['req-727a11d0-9aef-437e-82a1-734a7d399471'] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] During handling of the above exception, another exception occurred: [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] Traceback (most recent call last): [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._deallocate_network(context, instance, requested_networks) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self.network_api.deallocate_for_instance( [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] data = neutron.list_ports(**search_opts) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.list('ports', self.ports_path, retrieve_all, [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1252.626316] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] for r in self._pagination(collection, path, **params): [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] res = self.get(path, params=params) [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.retry_request("GET", action, body=body, [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] return self.do_request(method, action, body=body, [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] ret = obj(*args, **kwargs) [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] self._handle_fault_response(status_code, replybody, resp) [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] raise exception.Unauthorized() [ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] nova.exception.Unauthorized: Not authorized. 
[ 1252.627538] env[66583]: ERROR nova.compute.manager [instance: e1873f82-8e24-460a-b5cb-36e3bf06abcb] [ 1252.642499] env[66583]: DEBUG oslo_concurrency.lockutils [None req-722edd84-591e-4152-9cf2-c67ec66978f1 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "e1873f82-8e24-460a-b5cb-36e3bf06abcb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 319.975s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1252.682544] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1252.683295] env[66583]: ERROR nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last): [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] result = getattr(controller, method)(*args, **kwargs) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._get(image_id) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] resp, body = self.http_client.get(url, headers=header) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self.request(url, 'GET', 
**kwargs) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._handle_response(resp) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise exc.from_response(resp, resp.content) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] During handling of the above exception, another exception occurred: [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last): [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] yield resources [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self.driver.spawn(context, instance, image_meta, [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._fetch_image_if_missing(context, vi) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image_fetch(context, vi, tmp_image_ds_loc) [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
images.fetch_image( [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1252.683295] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] metadata = IMAGE_API.get(context, image_ref) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return session.show(context, image_id, [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] _reraise_translated_image_exception(image_id) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise new_exc.with_traceback(exc_trace) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] result = getattr(controller, method)(*args, **kwargs) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._get(image_id) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] resp, body = self.http_client.get(url, headers=header) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self.request(url, 'GET', **kwargs) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1252.684492] 
env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._handle_response(resp) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise exc.from_response(resp, resp.content) [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. [ 1252.684492] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1252.684492] env[66583]: INFO nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Terminating instance [ 1252.686850] env[66583]: DEBUG oslo_concurrency.lockutils [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2a5e619b-8532-4b3c-9d86-85994a7987af/2a5e619b-8532-4b3c-9d86-85994a7987af.vmdk" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1252.687076] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1252.687321] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a8a46485-4c3a-4f02-bc9b-5b25b434475f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1252.689911] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Start destroying the instance on the hypervisor. 
{{(pid=66583) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1252.690112] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Destroying instance {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1252.690870] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe8f0f30-446a-42a1-b3f2-783c57549f59 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1252.697527] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Unregistering the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1252.697727] env[66583]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a12839e0-b490-407d-8b76-e9b21d1e27f1 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1252.700027] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1252.700202] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=66583) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1252.701112] env[66583]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39ce3d28-eb42-4821-a93f-fe5ac61cb57b {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1252.705715] env[66583]: DEBUG oslo_vmware.api [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Waiting for the task: (returnval){ [ 1252.705715] env[66583]: value = "session[52863156-ea99-bb12-2b18-8df289cda217]529378bc-ef04-06dc-085e-32080217ed4e" [ 1252.705715] env[66583]: _type = "Task" [ 1252.705715] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1252.717886] env[66583]: DEBUG oslo_vmware.api [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Task: {'id': session[52863156-ea99-bb12-2b18-8df289cda217]529378bc-ef04-06dc-085e-32080217ed4e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1252.753881] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Unregistered the VM {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1252.754088] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Deleting contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1252.754269] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleting the datastore file [datastore1] 86690ef8-17b8-4d25-a2a4-54c68c98ac7a {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1252.754503] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a3f3442b-0fe8-4e6b-9cf9-422865128119 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1252.760481] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Waiting for the task: (returnval){ [ 1252.760481] env[66583]: value = "task-3470402" [ 1252.760481] env[66583]: _type = "Task" [ 1252.760481] env[66583]: } to complete. {{(pid=66583) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1252.767989] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470402, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1253.216154] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Preparing fetch location {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1253.216546] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating directory with path [datastore1] vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1253.216689] env[66583]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-311894e9-6643-48f1-aa47-f82ea07c63e7 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.227274] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Created directory with path [datastore1] vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af {{(pid=66583) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1253.227445] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Fetch image to [datastore1] vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk {{(pid=66583) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1253.227615] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to [datastore1] vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk on the data store datastore1 {{(pid=66583) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1253.228391] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5beb3b0e-3971-4e53-9b19-14113e2a9eee {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.234799] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab7ae351-d576-47fc-8b5b-4205979d969f {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.243410] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf10fba2-5e59-4b7e-9859-225a655430d6 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.276357] env[66583]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30b96f73-a41e-4657-80cf-967cf11bcaee {{(pid=66583) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.284129] env[66583]: DEBUG oslo_vmware.api [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Task: {'id': task-3470402, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068856} completed successfully. {{(pid=66583) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1253.285562] env[66583]: DEBUG nova.virt.vmwareapi.ds_util [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Deleted the datastore file {{(pid=66583) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1253.285750] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Deleted contents of the VM from datastore datastore1 {{(pid=66583) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1253.285923] env[66583]: DEBUG nova.virt.vmwareapi.vmops [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance destroyed {{(pid=66583) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1253.286109] env[66583]: INFO nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1253.287835] env[66583]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-1a5ab0dd-0103-4c0f-be67-a563ff96e0d8 {{(pid=66583) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1253.290415] env[66583]: DEBUG nova.compute.claims [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Aborting claim: {{(pid=66583) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1253.290587] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1253.290798] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1253.307674] env[66583]: DEBUG nova.virt.vmwareapi.images [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Downloading image file data 2a5e619b-8532-4b3c-9d86-85994a7987af to the data store datastore1 {{(pid=66583) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1253.316382] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1253.317098] env[66583]: DEBUG nova.compute.utils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a could not be found. {{(pid=66583) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1253.318820] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Instance disappeared during build. 
{{(pid=66583) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1253.319007] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Unplugging VIFs for instance {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1253.319177] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=66583) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1253.319349] env[66583]: DEBUG nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Deallocating network for instance {{(pid=66583) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1253.319508] env[66583]: DEBUG nova.network.neutron [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] deallocate_for_instance() {{(pid=66583) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1253.353970] env[66583]: DEBUG oslo_vmware.rw_handles [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=66583) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1253.411257] env[66583]: DEBUG oslo_vmware.rw_handles [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Completed reading data from the image iterator. {{(pid=66583) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1253.411627] env[66583]: DEBUG oslo_vmware.rw_handles [None req-7343d5e5-e69f-4562-ac6f-e0a5a0179c58 tempest-ServersTestMultiNic-187228492 tempest-ServersTestMultiNic-187228492-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f6fe5835-4a5b-4902-9972-09a79ba6223f/2a5e619b-8532-4b3c-9d86-85994a7987af/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=66583) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1253.475929] env[66583]: DEBUG neutronclient.v2_0.client [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=66583) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1253.477535] env[66583]: ERROR nova.compute.manager [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last): [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] result = getattr(controller, method)(*args, **kwargs) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._get(image_id) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] resp, body = self.http_client.get(url, headers=header) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self.request(url, 'GET', **kwargs) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._handle_response(resp) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 
120, in _handle_response [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise exc.from_response(resp, resp.content) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] During handling of the above exception, another exception occurred: [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last): [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self.driver.spawn(context, instance, image_meta, [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._fetch_image_if_missing(context, vi) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image_fetch(context, vi, tmp_image_ds_loc) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] images.fetch_image( [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] metadata = IMAGE_API.get(context, image_ref) [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return session.show(context, image_id, [ 1253.477535] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1253.478608] 
env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] _reraise_translated_image_exception(image_id) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise new_exc.with_traceback(exc_trace) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] result = getattr(controller, method)(*args, **kwargs) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._get(image_id) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] resp, body = self.http_client.get(url, headers=header) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self.request(url, 'GET', **kwargs) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._handle_response(resp) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise exc.from_response(resp, resp.content) [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] nova.exception.ImageNotAuthorized: Not authorized for image 2a5e619b-8532-4b3c-9d86-85994a7987af. 
[ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] During handling of the above exception, another exception occurred: [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last): [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._build_and_run_instance(context, instance, image, [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] with excutils.save_and_reraise_exception(): [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self.force_reraise() [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise self.value [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] with self.rt.instance_claim(context, instance, node, allocs, [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self.abort() [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1253.478608] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return f(*args, **kwargs) [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] self._unset_instance_host_and_node(instance) [ 1253.479651] env[66583]: ERROR nova.compute.manager 
[instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] instance.save() [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] updates, result = self.indirection_api.object_action( [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return cctxt.call(context, 'object_action', objinst=objinst, [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] result = self.transport._send( [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._driver.send(target, ctxt, message, [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] raise result [ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] nova.exception_Remote.InstanceNotFound_Remote: Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a could not be found. 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last):
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return getattr(target, method)(*args, **kwargs)
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return fn(self, *args, **kwargs)
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return f(*args, **kwargs)
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     with excutils.save_and_reraise_exception() as ectxt:
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     self.force_reraise()
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     raise self.value
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return f(*args, **kwargs)
[ 1253.479651] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return f(context, *args, **kwargs)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     raise exception.InstanceNotFound(instance_id=uuid)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] nova.exception.InstanceNotFound: Instance 86690ef8-17b8-4d25-a2a4-54c68c98ac7a could not be found.
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] During handling of the above exception, another exception occurred:
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last):
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     exception_handler_v20(status_code, error_body)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     raise client_exc(message=error_message,
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Neutron server returns request_ids: ['req-0f9a41d2-31dc-435c-a953-71a1fbc2fe13']
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] During handling of the above exception, another exception occurred:
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] 
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] Traceback (most recent call last):
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     self._deallocate_network(context, instance, requested_networks)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     self.network_api.deallocate_for_instance(
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     data = neutron.list_ports(**search_opts)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return self.list('ports', self.ports_path, retrieve_all,
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1253.480898] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     for r in self._pagination(collection, path, **params):
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     res = self.get(path, params=params)
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return self.retry_request("GET", action, body=body,
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     return self.do_request(method, action, body=body,
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     ret = obj(*args, **kwargs)
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     self._handle_fault_response(status_code, replybody, resp)
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a]     raise exception.Unauthorized()
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] nova.exception.Unauthorized: Not authorized.
[ 1253.482188] env[66583]: ERROR nova.compute.manager [instance: 86690ef8-17b8-4d25-a2a4-54c68c98ac7a] [ 1253.498115] env[66583]: DEBUG oslo_concurrency.lockutils [None req-fcd58d06-bfa3-4e6a-becc-08fe83a53f12 tempest-ListServerFiltersTestJSON-966352621 tempest-ListServerFiltersTestJSON-966352621-project-member] Lock "86690ef8-17b8-4d25-a2a4-54c68c98ac7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 320.484s {{(pid=66583) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1258.137933] env[66583]: DEBUG nova.compute.manager [req-add19de8-8c1e-465e-a6cf-700c86893ee1 req-917a1b7a-bcf9-4cb5-8809-289a171eb4aa service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-vif-deleted-11efbf9d-f286-42b4-a482-9917f9905d94 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}} [ 1258.138256] env[66583]: INFO nova.compute.manager [req-add19de8-8c1e-465e-a6cf-700c86893ee1 req-917a1b7a-bcf9-4cb5-8809-289a171eb4aa service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Neutron deleted interface 11efbf9d-f286-42b4-a482-9917f9905d94; detaching it from the instance and deleting it from the info cache [ 1258.138419] env[66583]: DEBUG nova.network.neutron [req-add19de8-8c1e-465e-a6cf-700c86893ee1 req-917a1b7a-bcf9-4cb5-8809-289a171eb4aa service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Updating instance_info_cache with network_info: [{"id": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "address": "fa:16:3e:11:36:c4", "network": {"id": "3a91be16-00b4-4e41-b806-c38af1853b82", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-825860812", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "09706dc60f2148b5a1b340af34b11f0d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a071ecf4-e713-4f97-9271-8c17952f6dee", "external-id": "nsx-vlan-transportzone-23", "segmentation_id": 23, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap36d4ac78-bf", "ovs_interfaceid": "36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=66583) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1258.150700] env[66583]: DEBUG oslo_concurrency.lockutils [req-add19de8-8c1e-465e-a6cf-700c86893ee1 req-917a1b7a-bcf9-4cb5-8809-289a171eb4aa service nova] Acquiring lock "c29638e8-98fd-4de7-8628-932b19087ecd" {{(pid=66583) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1260.173162] env[66583]: DEBUG nova.compute.manager [req-585dfd1f-c912-4a76-9d51-a94f10334a9a req-20d1c6d5-e6fc-41ba-beb5-578f9a81d7df service nova] [instance: c29638e8-98fd-4de7-8628-932b19087ecd] Received event network-vif-deleted-36d4ac78-bfc1-4a1b-9e9d-d72348af8ce2 {{(pid=66583) external_instance_event /opt/stack/nova/nova/compute/manager.py:11004}}