[ 523.281041] env[59447]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 523.718414] env[59490]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 525.242367] env[59490]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'linux_bridge' {{(pid=59490) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 525.242743] env[59490]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'noop' {{(pid=59490) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 525.242807] env[59490]: DEBUG os_vif [-] Loaded VIF plugin class '' with name 'ovs' {{(pid=59490) initialize /usr/local/lib/python3.10/dist-packages/os_vif/__init__.py:44}}
[ 525.243081] env[59490]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 525.244212] env[59490]: WARNING nova.servicegroup.api [-] Report interval must be less than service down time. Current config: . Setting service_down_time to: 300
[ 525.363830] env[59490]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=59490) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:384}}
[ 525.374296] env[59490]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.010s {{(pid=59490) execute /usr/local/lib/python3.10/dist-packages/oslo_concurrency/processutils.py:422}}
[ 525.471223] env[59490]: INFO nova.virt.driver [None req-269f2d1f-0041-43f9-9383-fe1c00a413ba None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 525.545059] env[59490]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 525.545201] env[59490]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 525.545318] env[59490]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=59490) __init__ /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:242}}
[ 528.750439] env[59490]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-9c84e4ae-43c7-492f-bbb6-c62501166ca6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.766023] env[59490]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=59490) _create_session /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:242}}
[ 528.766251] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-8392a59f-baf1-474a-a41d-132fe5ce6245 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.794909] env[59490]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 5dc7c.
[ 528.795114] env[59490]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 3.250s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 528.795674] env[59490]: INFO nova.virt.vmwareapi.driver [None req-269f2d1f-0041-43f9-9383-fe1c00a413ba None None] VMware vCenter version: 7.0.3
[ 528.799049] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-593d6e05-a264-4597-ab50-2e2e6dea3b95 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.820539] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a059ecbf-4601-46e1-a7f5-3bb827b4d325 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.827026] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9099fd2-eed5-49ad-9023-e0a1a34728bc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.834250] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b37e249-a3a8-4995-9283-658987d6849f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.847623] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8f471fcf-0754-485f-998e-26cff4aacd43 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.854020] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2699ce-a2d6-4d46-b4ec-0b5df3236261 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.884564] env[59490]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-420d250f-d597-4be9-9c5e-4ed45fafa003 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 528.889980] env[59490]: DEBUG nova.virt.vmwareapi.driver [None req-269f2d1f-0041-43f9-9383-fe1c00a413ba None None] Extension org.openstack.compute already exists. {{(pid=59490) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:214}}
[ 528.892612] env[59490]: INFO nova.compute.provider_config [None req-269f2d1f-0041-43f9-9383-fe1c00a413ba None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 528.908184] env[59490]: DEBUG nova.context [None req-269f2d1f-0041-43f9-9383-fe1c00a413ba None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),ea362c59-13b9-4db4-9e8f-1cda2e57ee77(cell1) {{(pid=59490) load_cells /opt/stack/nova/nova/context.py:464}} [ 528.910105] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.910321] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.911039] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.911383] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Acquiring lock "ea362c59-13b9-4db4-9e8f-1cda2e57ee77" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 528.911566] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Lock "ea362c59-13b9-4db4-9e8f-1cda2e57ee77" acquired by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 528.912550] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Lock "ea362c59-13b9-4db4-9e8f-1cda2e57ee77" "released" by "nova.context.set_target_cell..get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 528.924586] env[59490]: DEBUG oslo_db.sqlalchemy.engines [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59490) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 528.928582] env[59490]: DEBUG oslo_db.sqlalchemy.engines [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=59490) _check_effective_sql_mode /usr/local/lib/python3.10/dist-packages/oslo_db/sqlalchemy/engines.py:335}} [ 528.931085] env[59490]: ERROR nova.db.main.api [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 528.931085] env[59490]: result = function(*args, **kwargs) [ 528.931085] env[59490]: File 
"/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 528.931085] env[59490]: return func(*args, **kwargs) [ 528.931085] env[59490]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 528.931085] env[59490]: result = fn(*args, **kwargs) [ 528.931085] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 528.931085] env[59490]: return f(*args, **kwargs) [ 528.931085] env[59490]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 528.931085] env[59490]: return db.service_get_minimum_version(context, binaries) [ 528.931085] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 528.931085] env[59490]: _check_db_access() [ 528.931085] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 528.931085] env[59490]: stacktrace = ''.join(traceback.format_stack()) [ 528.931085] env[59490]: [ 528.933737] env[59490]: ERROR nova.db.main.api [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] No DB access allowed in nova-compute: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 528.933737] env[59490]: result = function(*args, **kwargs) [ 528.933737] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 528.933737] env[59490]: return func(*args, **kwargs) [ 528.933737] env[59490]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result [ 528.933737] env[59490]: result = fn(*args, **kwargs) [ 528.933737] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper [ 528.933737] env[59490]: return f(*args, **kwargs) [ 528.933737] env[59490]: File "/opt/stack/nova/nova/objects/service.py", line 546, in _db_service_get_minimum_version [ 528.933737] env[59490]: return db.service_get_minimum_version(context, binaries) [ 528.933737] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper [ 528.933737] env[59490]: _check_db_access() [ 528.933737] env[59490]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access [ 528.933737] env[59490]: stacktrace = ''.join(traceback.format_stack()) [ 528.933737] env[59490]: [ 528.934368] env[59490]: WARNING nova.objects.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Failed to get minimum service version for cell ea362c59-13b9-4db4-9e8f-1cda2e57ee77 [ 528.934368] env[59490]: WARNING nova.objects.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000 [ 528.934661] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Acquiring lock "singleton_lock" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 528.934811] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Acquired lock "singleton_lock" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 528.935056] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Releasing lock "singleton_lock" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 528.935390] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Full set of CONF: {{(pid=59490) _wait_for_exit_or_signal 
/usr/local/lib/python3.10/dist-packages/oslo_service/service.py:362}} [ 528.935530] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ******************************************************************************** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2589}} [ 528.935655] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] Configuration options gathered from: {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2590}} [ 528.935785] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2591}} [ 528.935975] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2592}} [ 528.936114] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ================================================================================ {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2594}} [ 528.936339] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] allow_resize_to_same_host = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.936512] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] arq_binding_timeout = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.936643] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] backdoor_port = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.936769] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] backdoor_socket = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.936929] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] block_device_allocate_retries = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937106] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] block_device_allocate_retries_interval = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937279] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cert = self.pem {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937475] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937643] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
compute_monitors = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937812] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] config_dir = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.937978] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] config_drive_format = iso9660 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938125] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938288] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] config_source = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938452] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] console_host = devstack {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938612] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] control_exchange = nova {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938766] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cpu_allocation_ratio = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.938924] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] daemon = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939095] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] debug = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939252] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] default_access_ip_network_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939418] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] default_availability_zone = nova {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939570] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] default_ephemeral_format = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939798] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 
'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.939956] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] default_schedule_zone = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940122] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] disk_allocation_ratio = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940282] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] enable_new_services = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940477] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] enabled_apis = ['osapi_compute'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940640] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] enabled_ssl_apis = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940796] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] flat_injected = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.940950] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] force_config_drive = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941115] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] force_raw_images = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941280] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] graceful_shutdown_timeout = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941436] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] heal_instance_info_cache_interval = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941637] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] host = cpu-1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941804] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] initial_cpu_allocation_ratio = 4.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.941960] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] initial_disk_allocation_ratio = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942129] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] initial_ram_allocation_ratio = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942332] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942494] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_build_timeout = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942648] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_delete_interval = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942807] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_format = [instance: %(uuid)s] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.942966] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_name_template = instance-%08x {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.943136] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_usage_audit = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.943302] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_usage_audit_period = month {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.943506] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.943682] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] instances_path = /opt/stack/data/nova/instances {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.943847] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] internal_service_availability_zone = internal {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944007] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] key = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944197] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] live_migration_retry_count = 30 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944399] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_config_append = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944570] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944725] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_dir = None {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944877] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.944993] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_options = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945164] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_rotate_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945327] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_rotate_interval_type = days {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945490] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] log_rotation_type = none {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945614] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945734] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.945893] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946061] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946190] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946372] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] long_rpc_timeout = 1800 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946547] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_concurrent_builds = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946708] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_concurrent_live_migrations = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.946862] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_concurrent_snapshots = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947025] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_local_block_devices = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947188] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_logfile_count = 30 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947341] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] max_logfile_size_mb = 200 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947497] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] maximum_instance_delete_attempts = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947660] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metadata_listen = 0.0.0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947821] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metadata_listen_port = 8775 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.947983] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metadata_workers = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948153] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] migrate_max_retries = -1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948315] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] mkisofs_cmd = genisoimage {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948514] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] my_block_storage_ip = 10.180.1.21 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948643] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] my_ip = 10.180.1.21 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948801] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] network_allocate_retries = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.948973] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.949151] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] osapi_compute_listen = 0.0.0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.949310] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] osapi_compute_listen_port = 8774 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.949503] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] osapi_compute_unique_server_name_scope = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.949684] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] osapi_compute_workers = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.949843] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] password_length = 12 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950007] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] periodic_enable = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950173] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] periodic_fuzzy_delay = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950337] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] pointer_model = usbtablet {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950499] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] preallocate_images = none {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950652] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] publish_errors = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950776] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] pybasedir = /opt/stack/nova {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.950926] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ram_allocation_ratio = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951097] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rate_limit_burst = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951259] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rate_limit_except_level = CRITICAL {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951411] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rate_limit_interval = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951563] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reboot_timeout = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951713] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
reclaim_instance_interval = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.951860] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] record = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952031] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reimage_timeout_per_gb = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952200] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] report_interval = 120 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952358] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rescue_timeout = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952564] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reserved_host_cpus = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952749] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reserved_host_disk_mb = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.952908] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reserved_host_memory_mb = 512 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953078] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] reserved_huge_pages = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953240] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] resize_confirm_window = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953400] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] resize_fs_using_block_device = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953556] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] resume_guests_state_on_host_boot = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953720] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.953879] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rpc_response_timeout = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954044] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] run_external_periodic_tasks = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954219] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] running_deleted_instance_action = reap 
{{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954378] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] running_deleted_instance_poll_interval = 1800 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954531] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] running_deleted_instance_timeout = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954683] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler_instance_sync_interval = 120 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954815] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_down_time = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.954978] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] servicegroup_driver = db {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.955147] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] shelved_offload_time = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.955303] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] shelved_poll_interval = 3600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.955474] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] shutdown_timeout = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.955655] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] source_is_ipv6 = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.955815] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ssl_only = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956062] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956246] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] sync_power_state_interval = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956482] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] sync_power_state_pool_size = 1000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956599] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] syslog_log_facility = LOG_USER {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956753] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] tempdir = None {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.956909] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] timeout_nbd = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957085] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] transport_url = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957245] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] update_resources_interval = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957403] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_cow_images = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957557] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_eventlog = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957711] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_journal = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.957866] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_json = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958027] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_rootwrap_daemon = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958185] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_stderr = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958338] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] use_syslog = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958492] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vcpu_pin_set = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958687] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plugging_is_fatal = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.958855] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plugging_timeout = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.959028] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] virt_mkfs = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.959191] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] volume_usage_poll_interval = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.959346] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] watch_log_file = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.959511] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] web = /usr/share/spice-html5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2602}} [ 528.959694] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_concurrency.disable_process_locking = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.959987] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.960179] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.960344] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.960510] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_metrics.metrics_process_name = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.960673] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.960832] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961014] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.auth_strategy = keystone {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961177] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.compute_link_prefix = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961344] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961513] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.dhcp_domain = novalocal {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961702] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.enable_instance_password = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.961865] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.glance_link_prefix = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962036] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962207] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.instance_list_cells_batch_strategy = distributed {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962366] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.instance_list_per_project_cells = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962523] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.list_records_by_skipping_down_cells = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962677] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.local_metadata_per_cell = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.962839] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.max_limit = 1000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963007] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.metadata_cache_expiration = 15 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963184] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.neutron_default_tenant_id = default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963347] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.use_forwarded_for = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963509] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.use_neutron_default_nets = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963669] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963826] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_dynamic_failure_fatal = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.963985] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.964181] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_dynamic_ssl_certfile = {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.964354] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_dynamic_targets = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.964516] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_jsonfile_path = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.964712] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api.vendordata_providers = ['StaticJSON'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.964901] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.backend = dogpile.cache.memcached {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965076] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.backend_argument = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965242] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.config_prefix = cache.oslo {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965405] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.dead_timeout = 60.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965567] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.debug_cache_backend = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965726] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.enable_retry_client = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.965882] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.enable_socket_keepalive = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966057] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.enabled = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966240] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.expiration_time = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966414] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.hashclient_retry_attempts = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966591] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.hashclient_retry_delay = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966734] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
cache.memcache_dead_retry = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.966898] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_password = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967076] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967241] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967424] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_pool_maxsize = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967590] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_pool_unused_timeout = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967749] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_sasl_enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.967926] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_servers = ['localhost:11211'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968099] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_socket_timeout = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968271] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.memcache_username = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968459] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.proxies = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968624] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.retry_attempts = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968787] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.retry_delay = 0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.968947] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.socket_keepalive_count = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969116] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.socket_keepalive_idle = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969274] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.socket_keepalive_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969427] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.tls_allowed_ciphers = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969580] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.tls_cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969731] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.tls_certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.969888] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.tls_enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970051] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cache.tls_keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970220] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970412] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.auth_type = password {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970582] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970751] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.catalog_info = volumev3::publicURL {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.970909] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.971083] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.971247] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.cross_az_attach = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.971409] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.debug = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.971564] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.endpoint_template = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
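The cache.* records above show the [cache] group pointed at dogpile.cache.memcached on localhost:11211 with a 600-second expiration time; these options are declared by oslo.cache rather than by nova itself. A short sketch, assuming oslo.cache and oslo.config are installed, of how a service turns that group into a usable cache region; note that with cache.enabled left at its False default, oslo.cache quietly configures a no-op backend instead of memcached:

from oslo_cache import core as cache
from oslo_config import cfg

conf = cfg.ConfigOpts()
cache.configure(conf)  # registers the [cache] options seen in this dump
conf([])               # no config file here, so library defaults apply

region = cache.create_region()
cache.configure_cache_region(conf, region)

# With cache.enabled = True and memcached reachable this round-trips;
# against the no-op backend, get() returns dogpile's NO_VALUE sentinel.
region.set('demo-key', 'demo-value')
print(region.get('demo-key'))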
[ 528.971721] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.http_retries = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.971880] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972042] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972215] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.os_region_name = RegionOne {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972378] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972539] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cinder.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972700] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.972852] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.cpu_dedicated_set = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973016] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.cpu_shared_set = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973185] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.image_type_exclude_list = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973346] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.live_migration_wait_for_vif_plug = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973528] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.max_concurrent_disk_ops = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973697] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.max_disk_devices_to_attach = -1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.973855] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.974027] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 
528.974211] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.resource_provider_association_refresh = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.974377] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.shutdown_retry_interval = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.974551] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.974723] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] conductor.workers = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.974893] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] console.allowed_origins = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975060] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] console.ssl_ciphers = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975227] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] console.ssl_minimum_version = default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975395] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] consoleauth.token_ttl = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975559] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975708] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.975865] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976028] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976192] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976372] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976555] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.insecure = False {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976711] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.976865] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977033] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977194] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977349] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977511] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.service_type = accelerator {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977668] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977821] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.977973] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978140] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978316] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978473] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] cyborg.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978648] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.backend = sqlalchemy {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978818] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.connection = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.978980] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.connection_debug = 0 {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979158] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.connection_parameters = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979318] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.connection_recycle_time = 3600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979503] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.connection_trace = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979671] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.db_inc_retry_interval = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979833] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.db_max_retries = 20 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.979992] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.db_max_retry_interval = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.980163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.db_retry_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.980343] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.max_overflow = 50 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.980522] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.max_pool_size = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.980686] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.max_retries = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.980846] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.mysql_enable_ndb = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981016] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.mysql_sql_mode = TRADITIONAL {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981176] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.mysql_wsrep_sync_wait = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981334] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.pool_timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981498] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.retry_interval = 10 
{{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981651] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.slave_connection = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981812] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.sqlite_synchronous = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.981968] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] database.use_db_reconnect = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.982154] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.backend = sqlalchemy {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.982590] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.connection = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.982781] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.connection_debug = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.982955] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.connection_parameters = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983134] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.connection_recycle_time = 3600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983302] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.connection_trace = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983467] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.db_inc_retry_interval = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983628] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.db_max_retries = 20 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983786] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.db_max_retry_interval = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.983945] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.db_retry_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.984140] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.max_overflow = 50 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
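The **** values scattered through the dump (cache.backend_argument earlier, database.connection and database.slave_connection, api_database.connection above, key_manager.fixed_key further down) mark options declared with secret=True: log_opt_values() masks them so credentials embedded in SQLAlchemy URLs or fixed keys never reach DEBUG logs. A minimal sketch, assuming oslo.config; the connection string is a made-up placeholder, not a value from this deployment:

import logging

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

conf = cfg.ConfigOpts()
conf.register_opts(
    [cfg.StrOpt('connection', secret=True,  # secret=True forces masking
                default='mysql+pymysql://nova:hunter2@203.0.113.5/nova')],
    group='database')

conf([])
conf.log_opt_values(LOG, logging.DEBUG)  # prints: database.connection = ****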
[ 528.984317] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.max_pool_size = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.984485] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.max_retries = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.984649] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.mysql_enable_ndb = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.984814] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.984969] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.mysql_wsrep_sync_wait = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.985141] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.pool_timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.985307] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.retry_interval = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.985473] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.slave_connection = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.985656] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] api_database.sqlite_synchronous = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.985834] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] devices.enabled_mdev_types = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986015] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986191] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ephemeral_storage_encryption.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986372] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ephemeral_storage_encryption.key_size = 512 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986544] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.api_servers = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986706] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.cafile = None {{(pid=59490) log_opt_values
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.986866] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987034] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987196] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987355] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987515] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.debug = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987676] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.default_trusted_certificate_ids = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987835] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.enable_certificate_validation = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.987992] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.enable_rbd_download = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988162] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988327] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988494] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988671] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988832] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.988993] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.num_retries = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989196] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.rbd_ceph_conf = {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989332] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.rbd_connect_timeout = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989498] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.rbd_pool = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989661] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.rbd_user = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989818] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.989974] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990150] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.service_type = image {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990312] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990471] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990623] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990775] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.990948] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991121] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.verify_glance_signatures = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991278] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] glance.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991443] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] guestfs.debug = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991632] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.config_drive_cdrom = False {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991801] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.config_drive_inject_password = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.991962] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992139] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.enable_instance_metrics_collection = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992306] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.enable_remotefx = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992474] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.instances_path_share = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992634] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.iscsi_initiator_list = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992792] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.limit_cpu_features = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.992952] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993122] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993287] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.power_state_check_timeframe = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993446] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.power_state_event_polling_interval = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993609] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993765] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.use_multipath_io = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.993919] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.volume_attach_retry_count = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.994084] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.volume_attach_retry_interval = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.994240] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.vswitch_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.994398] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.994578] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] mks.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.994933] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995130] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.manager_interval = 2400 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995299] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.precache_concurrency = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995466] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.remove_unused_base_images = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995630] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995792] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.995960] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] image_cache.subdirectory_name = _base {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996144] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.api_max_retries = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996330] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.api_retry_interval = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996496] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996657] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.auth_type = None {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996813] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.996969] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997142] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997334] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997512] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997673] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997836] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.997993] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998319] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998476] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.partition_key = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998635] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.peer_list = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998787] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.998946] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.serial_console_state_timeout = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999110] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.service_name = None {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999276] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.service_type = baremetal {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999436] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999589] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999741] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 528.999895] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000081] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000239] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ironic.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000438] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000616] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] key_manager.fixed_key = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000794] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.000954] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.barbican_api_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001123] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.barbican_endpoint = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001292] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.barbican_endpoint_type = public {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001447] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.barbican_region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001599] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001751] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.001908] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002073] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002229] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002387] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.number_of_retries = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002552] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.retry_delay = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002711] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.send_service_user_token = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.002870] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003030] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003191] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.verify_ssl = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003352] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican.verify_ssl_path = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003529] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003687] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.auth_type = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003841] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.003991] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004321] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004479] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004631] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004811] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] barbican_service_user.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.004979] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.approle_role_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.approle_secret_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006270] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.kv_mountpoint = secret {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006646] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.kv_version = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}}
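The barbican.*, barbican_service_user.*, and vault.* groups here, together with key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager logged earlier, belong to nova's key-manager layer built on the castellan library; the values in this run match the library defaults (localhost endpoints, unset credentials), so no external Barbican or Vault appears to be configured. A sketch of how such a group round-trips through an INI file, assuming only oslo.config; the declarations mirror the vault defaults logged above:

import logging
import tempfile

from oslo_config import cfg

logging.basicConfig(level=logging.DEBUG)
LOG = logging.getLogger(__name__)

INI = "[vault]\nkv_mountpoint = secret\nkv_version = 2\n"

conf = cfg.ConfigOpts()
conf.register_opts(
    [cfg.StrOpt('kv_mountpoint', default='secret'),
     cfg.IntOpt('kv_version', default=2)],
    group='vault')

with tempfile.NamedTemporaryFile('w', suffix='.conf') as f:
    f.write(INI)
    f.flush()
    conf(['--config-file', f.name])
    conf.log_opt_values(LOG, logging.DEBUG)  # vault.kv_mountpoint = secret ...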
[ 529.006646] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.namespace = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006734] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.root_token_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.006830] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007083] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.ssl_ca_crt_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007182] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007346] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.use_ssl = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007517] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007679] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007834] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.007992] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008157] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008311] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008467] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008622] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008772] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.008920] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd
None None] keystone.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009079] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009234] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009389] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009581] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.service_type = identity {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009746] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.009903] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010070] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010228] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010403] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010560] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] keystone.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010752] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.connection_uri = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.010911] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_mode = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011086] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_model_extra_flags = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011258] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_models = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011427] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None 
None] libvirt.cpu_power_governor_high = performance {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011590] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_power_governor_low = powersave {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011758] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_power_management = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.011924] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012095] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.device_detach_attempts = 8 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012257] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.device_detach_timeout = 20 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012420] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.disk_cachemodes = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012598] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.disk_prefix = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012763] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.enabled_perf_events = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.012923] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.file_backed_memory = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013098] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.gid_maps = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013257] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.hw_disk_discard = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013414] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.hw_machine_type = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013580] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_rbd_ceph_conf = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013742] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.013904] env[59490]: DEBUG 
oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014079] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_rbd_glance_store_name = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014242] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_rbd_pool = rbd {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014407] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_type = default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014560] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.images_volume_group = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014719] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.inject_key = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.014877] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.inject_partition = -2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015043] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.inject_password = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015206] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.iscsi_iface = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015367] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.iser_use_multipath = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015544] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_bandwidth = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015712] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_completion_timeout = 800 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.015871] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_downtime = 500 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016039] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_downtime_delay = 75 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016223] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_downtime_steps = 10 {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016383] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_inbound_addr = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016546] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_permit_auto_converge = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016702] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_permit_post_copy = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.016864] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_scheme = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017042] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_timeout_action = abort {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017209] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_tunnelled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017362] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_uri = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017520] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.live_migration_with_native_tls = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017675] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.max_queues = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017833] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.mem_stats_period_seconds = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.017984] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.nfs_mount_options = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.018465] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.018465] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_aoe_discover_tries = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.018661] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_iser_scan_tries = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.018837] env[59490]: DEBUG 
oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_memory_encrypted_guests = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.018999] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_nvme_discover_tries = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.019177] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_pcie_ports = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.019344] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.num_volume_scan_tries = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.019507] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.pmem_namespaces = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.019662] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.quobyte_client_cfg = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.019942] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020145] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rbd_connect_timeout = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020310] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020470] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020626] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rbd_secret_uuid = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020778] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rbd_user = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.020936] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.realtime_scheduler_priority = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021114] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.remote_filesystem_transport = ssh {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021271] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rescue_image_id = None {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021425] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rescue_kernel_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021577] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rescue_ramdisk_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021736] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rng_dev_path = /dev/urandom {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.021888] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.rx_queue_size = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.022058] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.smbfs_mount_options = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.022326] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.022492] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.snapshot_compression = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.022645] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.snapshot_image_format = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.022852] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.023051] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.sparse_logical_volumes = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.023231] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.swtpm_enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.023399] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.swtpm_group = tss {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.023561] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.swtpm_user = tss {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.sysinfo_serial = unique {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.tx_queue_size = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.uid_maps = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.use_virtio_for_bridges = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.virt_type = kvm {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.volume_clear = zero {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.volume_clear_size = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.volume_use_multipath = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_cache_path = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_mount_group = qemu {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_mount_opts = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.025981] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.vzstorage_mount_user = stack {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026163] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026394] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026553] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.auth_type = password {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026712] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.026865] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027033] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027201] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027411] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027587] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.default_floating_pool = public {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027764] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.027935] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.extension_sync_interval = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028103] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.http_retries = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028277] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028432] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028584] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028746] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.metadata_proxy_shared_secret = **** {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.028901] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029098] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.ovs_bridge = br-int {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029278] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.physnets = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029444] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.region_name = RegionOne {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029609] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.service_metadata_proxy = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029763] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.029927] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.service_type = network {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030096] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030256] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030410] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030564] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030735] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.030888] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] neutron.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031069] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] notifications.bdms_in_notifications = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031254] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] notifications.default_level = INFO 
{{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031423] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] notifications.notification_format = unversioned {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031580] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] notifications.notify_on_state_change = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031749] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.031919] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] pci.alias = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032124] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] pci.device_spec = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032308] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] pci.report_in_placement = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032479] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032646] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.auth_type = password {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032808] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.auth_url = http://10.180.1.21/identity {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.032963] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033132] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033290] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033448] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033603] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033754] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.default_domain_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.033907] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.default_domain_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034071] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.domain_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034225] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.domain_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034377] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034532] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034681] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034830] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.034979] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.035197] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.password = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.035404] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.project_domain_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.035583] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.project_domain_name = Default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.035749] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.project_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.035916] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.project_name = service {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036094] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.region_name = RegionOne {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036276] env[59490]: DEBUG oslo_service.service 
[None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036454] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.service_type = placement {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036613] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036766] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.036921] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037086] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.system_scope = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037267] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037492] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.trust_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037665] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.user_domain_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037831] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.user_domain_name = Default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.037990] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.user_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.038175] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.username = placement {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.038352] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.038513] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] placement.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.038685] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.cores = 20 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.038847] env[59490]: DEBUG 
oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.count_usage_from_placement = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039022] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039197] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.injected_file_content_bytes = 10240 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039363] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.injected_file_path_length = 255 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039521] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.injected_files = 5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039681] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.instances = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.039841] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.key_pairs = 100 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040006] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.metadata_items = 128 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040182] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.ram = 51200 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040357] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.recheck_quota = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040537] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.server_group_members = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040699] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] quota.server_groups = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.040866] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rdp.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.041203] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.041387] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.041550] 
env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.041711] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.image_metadata_prefilter = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.041870] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042040] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.max_attempts = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042204] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.max_placement_results = 1000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042363] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042521] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.query_placement_for_availability_zone = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042677] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.query_placement_for_image_type_support = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042830] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.042997] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] scheduler.workers = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.043180] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.043438] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.aggregate_image_properties_isolation_separator = . 
{{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.043538] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.043726] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.043864] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.044034] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.044212] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.044436] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.044687] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.host_subset_size = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.044893] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.image_properties_default_architecture = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.045125] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.045325] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.isolated_hosts = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.045536] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.isolated_images = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.045777] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.max_instances_per_host = 50 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046038] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046229] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.pci_in_placement = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046606] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046784] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.046973] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.047158] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.047386] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.047629] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.track_instance_changes = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.047757] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.047993] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metrics.required = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.048221] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metrics.weight_multiplier = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.048416] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metrics.weight_of_unavailable = -10000.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.048622] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] metrics.weight_setting = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.048989] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.049227] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.049442] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.port_range = 10000:20000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.049662] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.049865] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.050380] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] serial_console.serialproxy_port = 6083 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.050380] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.050544] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.auth_type = password {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.050675] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.050881] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.051096] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.051311] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.051502] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.051740] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.send_service_user_token = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.051928] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] service_user.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.052141] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None 
None] service_user.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.052383] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.agent_enabled = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.052643] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.052972] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053183] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.html5proxy_host = 0.0.0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053354] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.html5proxy_port = 6082 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053514] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.image_compression = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053670] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.jpeg_compression = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053827] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.playback_compression = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.053994] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.server_listen = 127.0.0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054173] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054329] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.streaming_mode = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054483] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] spice.zlib_compression = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054644] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] upgrade_levels.baseapi = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054811] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] upgrade_levels.cert = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.054957] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] upgrade_levels.compute = auto {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055125] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] upgrade_levels.conductor = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055286] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] upgrade_levels.scheduler = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055452] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055609] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.auth_type = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055760] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.055913] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056081] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056267] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056434] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056622] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056779] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vendordata_dynamic_auth.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.056996] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.api_retry_count = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.057182] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.ca_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.057463] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.cache_prefix = devstack-image-cache {{(pid=59490) log_opt_values 
/usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.057571] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.cluster_name = testcl1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.057750] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.connection_pool_size = 10 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.057889] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.console_delay_seconds = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058066] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.datastore_regex = ^datastore.* {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058281] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058452] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.host_password = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058613] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.host_port = 443 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058775] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.host_username = administrator@vsphere.local {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.058940] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.insecure = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059110] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.integration_bridge = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059274] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.maximum_objects = 100 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059463] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.pbm_default_policy = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059626] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.pbm_enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059810] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.pbm_wsdl_location = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.059937] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.serial_log_dir = 
/opt/vmware/vspc {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.060178] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.serial_port_proxy_uri = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.060366] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.serial_port_service_uri = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.060538] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.task_poll_interval = 0.5 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.060707] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.use_linked_clone = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.060873] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.vnc_keymap = en-us {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.061047] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.vnc_port = 5900 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.061213] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vmware.vnc_port_total = 10000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.061400] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.auth_schemes = ['none'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.061573] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.061879] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.062073] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.062241] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.novncproxy_port = 6080 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.062417] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.server_listen = 127.0.0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.062585] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.062740] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd 
None None] vnc.vencrypt_ca_certs = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063030] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.vencrypt_client_cert = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063084] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vnc.vencrypt_client_key = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063225] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063384] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_deep_image_inspection = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063540] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_fallback_pcpu_query = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063696] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_group_policy_check_upcall = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.063847] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064009] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.disable_rootwrap = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064171] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.enable_numa_live_migration = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064372] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064541] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064697] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.handle_virt_lifecycle_events = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064850] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.libvirt_disable_apic = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.064998] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
workarounds.never_download_image_if_on_rbd = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065168] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065325] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065490] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065644] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065798] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.065952] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066118] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066300] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066473] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066669] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066811] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.client_socket_timeout = 900 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.066971] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.default_pool_size = 1000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.067146] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.keep_alive = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.067344] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
wsgi.max_header_line = 16384 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.067761] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.secure_proxy_ssl_header = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.067761] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.ssl_ca_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.067983] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.ssl_cert_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.068068] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.ssl_key_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.068228] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.tcp_keepidle = 600 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.068399] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.068571] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] zvm.ca_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.068731] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] zvm.cloud_connector_url = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069049] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069222] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] zvm.reachable_timeout = 300 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069399] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.enforce_new_defaults = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069564] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.enforce_scope = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069732] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.policy_default_rule = default {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.069910] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} 
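The option dump above is produced by a single oslo.config call at service startup. A minimal, illustrative sketch follows (hypothetical option names, not Nova's real option set) of how these records get emitted, including the '****' masking applied to values like vmware.host_password:

    import logging

    from oslo_config import cfg

    LOG = logging.getLogger(__name__)

    CONF = cfg.CONF
    CONF.register_opts(
        [cfg.BoolOpt('enabled', default=False),
         cfg.PortOpt('serialproxy_port', default=6083)],
        group='serial_console')
    CONF.register_opts(
        # secret=True is what renders a value as '****' in the dump.
        [cfg.StrOpt('host_password', secret=True)],
        group='vmware')

    logging.basicConfig(level=logging.DEBUG)
    CONF([])  # parse an empty argv; defaults apply
    # ConfigOpts.log_opt_values() is the function named in every
    # "log_opt_values ... cfg.py:2609" record above: it walks all
    # registered options and logs one DEBUG line per value.
    CONF.log_opt_values(LOG, logging.DEBUG)
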
[ 529.070091] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.policy_file = policy.yaml {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.070262] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.070420] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.070571] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.070722] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.070878] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071050] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071225] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071396] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.connection_string = messaging:// {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071559] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.enabled = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071722] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.es_doc_type = notification {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.071880] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.es_scroll_size = 10000 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072051] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.es_scroll_time = 2m {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072211] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.filter_error_trace = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072374] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.hmac_keys = SECRET_KEY {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072535] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.sentinel_service_name = mymaster {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072701] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.socket_timeout = 0.1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.072861] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] profiler.trace_sqlalchemy = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073031] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] remote_debug.host = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073190] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] remote_debug.port = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073363] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073521] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073717] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.073880] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074051] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074212] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074372] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074529] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074683] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074834] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.074997] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075172] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075337] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075499] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075654] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075817] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.075973] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076141] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076346] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076513] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076673] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076834] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.076991] env[59490]: DEBUG 
oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.077161] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.077344] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.077510] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.077675] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.077837] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078083] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078167] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078333] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_rabbit.ssl_version = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078514] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078674] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_notifications.retry = -1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.078847] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079015] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_messaging_notifications.transport_url = **** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079188] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.auth_section = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079346] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.auth_type = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079499] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.cafile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079649] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.certfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079807] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.collect_timing = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.079958] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.connect_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080125] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.connect_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080278] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.endpoint_id = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080429] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.endpoint_override = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080584] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.insecure = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080736] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.keyfile = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.080885] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.max_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081045] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.min_version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081199] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.region_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081351] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.service_name = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081501] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.service_type = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081657] env[59490]: DEBUG oslo_service.service [None 
req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.split_loggers = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081807] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.status_code_retries = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.081957] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.status_code_retry_delay = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082129] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.timeout = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082281] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.valid_interfaces = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082431] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_limit.version = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082587] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_reports.file_event_handler = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082744] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_reports.file_event_handler_interval = 1 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.082893] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] oslo_reports.log_dir = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083063] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083220] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_linux_bridge_privileged.group = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083401] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083578] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083738] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.083890] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] 
vif_plug_linux_bridge_privileged.user = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084065] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084238] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.group = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084444] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.helper_command = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084578] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084734] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.084886] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] vif_plug_ovs_privileged.user = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085060] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.flat_interface = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085236] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085403] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085564] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085727] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.085884] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086052] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086227] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_linux_bridge.vlan_interface = None 
{{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086410] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.isolate_vif = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086571] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086728] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.086889] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087060] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.ovsdb_interface = native {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087234] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_vif_ovs.per_port_bridge = False {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087405] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] os_brick.lock_path = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087567] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.capabilities = [21] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087717] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.group = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.087866] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.helper_command = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088030] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088210] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.thread_pool_size = 8 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088338] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] privsep_osbrick.user = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088502] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088653] env[59490]: DEBUG oslo_service.service 
[None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.group = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088801] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.helper_command = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.088959] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.089125] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.thread_pool_size = 8 {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.089275] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] nova_sys_admin.user = None {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2609}} [ 529.089403] env[59490]: DEBUG oslo_service.service [None req-c6dc0bd2-102c-4605-851c-74de4da3bdfd None None] ******************************************************************************** {{(pid=59490) log_opt_values /usr/local/lib/python3.10/dist-packages/oslo_config/cfg.py:2613}} [ 529.089793] env[59490]: INFO nova.service [-] Starting compute node (version 0.1.0) [ 529.099751] env[59490]: INFO nova.virt.node [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Generated node identity 715aacdb-6e76-47b7-ae6f-492abc122a20 [ 529.099841] env[59490]: INFO nova.virt.node [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Wrote node identity 715aacdb-6e76-47b7-ae6f-492abc122a20 to /opt/stack/data/n-cpu-1/compute_id [ 529.110891] env[59490]: WARNING nova.compute.manager [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Compute nodes ['715aacdb-6e76-47b7-ae6f-492abc122a20'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 529.142878] env[59490]: INFO nova.compute.manager [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 529.165011] env[59490]: WARNING nova.compute.manager [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. 
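The "compute_resources" lock records that follow come from oslo.concurrency's lockutils, which the resource tracker uses to serialize inventory updates; the library logs the waited/held durations on acquire and release. A condensed sketch of the pattern (not Nova's exact decorator stack):

    from oslo_concurrency import lockutils

    # All callables synchronized on the same name share one lock; entering
    # and leaving the decorated function produces the "Acquiring lock",
    # 'acquired ... waited', and '"released" ... held' lines seen below.
    @lockutils.synchronized('compute_resources')
    def _update_available_resource():
        # mutate shared resource-tracker state here
        pass
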
[ 529.165258] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 529.165469] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 529.165609] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 529.165750] env[59490]: DEBUG nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 529.166917] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca75875a-8ceb-49f7-988f-a05944fa543d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.175727] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3aff369-2db2-4882-ad01-5f11398300f2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.189749] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00f6354e-8bff-44fc-9256-74d8b5efe531 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.196067] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6983697e-6fe6-4e07-8dd6-2b0b63ed02b5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.225018] env[59490]: DEBUG nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181664MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 529.225180] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 529.225353] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 529.236101] env[59490]: WARNING nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] No compute node 
[ 529.247691] env[59490]: INFO nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 715aacdb-6e76-47b7-ae6f-492abc122a20 [ 529.293055] env[59490]: DEBUG nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 529.293209] env[59490]: DEBUG nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 529.397458] env[59490]: INFO nova.scheduler.client.report [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] [req-103ef51e-a9ff-45a3-a4fd-a6eb114ec1bb] Created resource provider record via placement API for resource provider with UUID 715aacdb-6e76-47b7-ae6f-492abc122a20 and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. [ 529.412681] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a50db3b-0d19-4d7f-9109-68a7128d09fd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.420121] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44a6b28c-01f6-48cf-acea-a930eb7e4b2e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.448952] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd1fa1c8-18bc-4a8b-8a50-31e688edcd51 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.455911] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a84127bb-abe4-4b86-bc8c-522202371487 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 529.468679] env[59490]: DEBUG nova.compute.provider_tree [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Updating inventory in ProviderTree for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 529.503341] env[59490]: DEBUG nova.scheduler.client.report [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Updated inventory for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}}
[ 529.503553] env[59490]: DEBUG nova.compute.provider_tree [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Updating resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20 generation from 0 to 1 during operation: update_inventory {{(pid=59490) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 529.503703] env[59490]: DEBUG nova.compute.provider_tree [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Updating inventory in ProviderTree for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 529.546294] env[59490]: DEBUG nova.compute.provider_tree [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Updating resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20 generation from 1 to 2 during operation: update_traits {{(pid=59490) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 529.565353] env[59490]: DEBUG nova.compute.resource_tracker [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 529.565594] env[59490]: DEBUG oslo_concurrency.lockutils [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.340s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 529.565810] env[59490]: DEBUG nova.service [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Creating RPC server for service compute {{(pid=59490) start /opt/stack/nova/nova/service.py:182}} [ 529.580009] env[59490]: DEBUG nova.service [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] Join ServiceGroup membership for this service compute {{(pid=59490) start /opt/stack/nova/nova/service.py:199}} [ 529.580194] env[59490]: DEBUG nova.servicegroup.drivers.db [None req-72f457bf-78b3-4e95-ba34-78a79503ba28 None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=59490) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}}
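Annotation: this closes the startup audit. The provider's inventory has been written to Placement, and each write bumps the provider generation (0 to 1 for update_inventory, 1 to 2 for update_traits), Placement's optimistic-concurrency guard against conflicting writers. A quick arithmetic check (annotator's sketch, not Nova code) of the capacity that inventory advertises, using the formula Placement applies, (total - reserved) * allocation_ratio:

```python
# Schedulable capacity per resource class, from the inventory logged above.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, int(capacity))
# VCPU 192, MEMORY_MB 196078, DISK_GB 400
```

With that in place the service starts accepting builds; the tempest-driven instance spawns begin immediately below.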
[ 568.561090] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "099ea64f-e4db-467c-a906-6f206c469ea5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.561090] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "099ea64f-e4db-467c-a906-6f206c469ea5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.577745] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 568.678612] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 568.678612] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 568.679312] env[59490]: INFO nova.compute.claims [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 568.797375] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75f90208-bdea-4387-91c3-5ccb36d272a6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.807150] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25697679-b56b-4f9b-bbff-c7d7bb099535 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.852772] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8185a062-e8cf-4cd4-bddb-9cd10833755a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.861476] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04a93c90-b6e2-4865-ad5a-beff3b437602 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 568.878604] env[59490]: DEBUG nova.compute.provider_tree [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 568.890077] env[59490]: DEBUG nova.scheduler.client.report [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48,
'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 568.920810] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 568.921601] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 568.971891] env[59490]: DEBUG nova.compute.utils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 568.975323] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 568.975594] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 568.995828] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 569.077271] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 569.385296] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "1568985c-6898-4b06-817e-f0354a903771" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.385551] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "1568985c-6898-4b06-817e-f0354a903771" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.404053] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 569.463771] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 569.464017] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 569.466711] env[59490]: INFO nova.compute.claims [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 569.584303] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3d636b53-eed8-4905-b7ed-119de904a1de {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.594410] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-423957e5-4468-42aa-bffc-1ea95c8cdc7b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.631680] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc58cba4-2bbf-44a3-93fa-6ae27eb1440b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.642863] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6c1a5cc-c55e-40a9-bfa5-15a8af1693a6 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 569.662021] env[59490]: DEBUG nova.compute.provider_tree [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.679037] env[59490]: DEBUG nova.scheduler.client.report [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 569.701217] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.237s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 569.701702] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 569.754574] env[59490]: DEBUG nova.compute.utils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 569.757807] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Not allocating networking since 'none' was specified. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 569.775253] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 569.877740] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 570.737698] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.737698] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.749272] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 570.828039] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 570.828399] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 570.829891] env[59490]: INFO nova.compute.claims [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 570.979018] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff770ac6-9229-4a57-9510-e139a40416e3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 570.988021] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50d31454-049b-4cb1-bc70-dc63c6da42ed {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.019801] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-266656e6-0f4b-4ccd-9202-e240da8198f4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.027642] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4e1467d-0226-4820-979a-1071a0ffca8b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.042487] env[59490]: DEBUG 
nova.compute.provider_tree [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 571.054645] env[59490]: DEBUG nova.scheduler.client.report [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 571.074158] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 571.074833] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 571.133467] env[59490]: DEBUG nova.compute.utils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 571.133696] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Not allocating networking since 'none' was specified. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 571.148904] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 571.234741] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 571.502111] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.502111] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.502833] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.502833] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.503579] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.503579] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.503579] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.503579] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.503986] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a 
tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.505017] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.505017] env[59490]: DEBUG nova.virt.hardware [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.507663] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e08932e3-e220-41d0-ac8d-54a654803e8e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.519899] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cd66764-ec00-4db3-98e4-5d441c80ff58 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.526619] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.526619] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.526619] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.526842] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.526842] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 
tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.526842] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.526842] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.526842] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.527034] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.527034] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.527034] env[59490]: DEBUG nova.virt.hardware [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.528389] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0905462e-3d73-4294-838d-99dcd0b35dda {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.548605] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 571.548967] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 571.549611] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 571.549611] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 571.549611] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 571.549611] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 571.549783] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 571.550193] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 571.550193] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 571.552645] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 571.552824] env[59490]: DEBUG nova.virt.hardware [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b 
tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 571.558185] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b748ace8-f757-42ed-b021-2457c7202611 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.570130] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ea150a-8698-4db0-8a65-35e9ee42a4ec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.586955] env[59490]: DEBUG nova.policy [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e15e3be4fcec42abbb9bc7c416dc6a41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a01faa28c34408a9ec5ee2d02785813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 571.599008] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8334dc2e-104d-4f85-95b8-3025b8944391 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.611530] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b8de4df-3a22-4929-9ab6-92b36626e17c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.624010] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Instance VIF info [] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 571.634141] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.634343] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-21434d48-1ef4-4a1a-9f3d-5cdd877457be {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.644452] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Instance VIF info [] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 571.650132] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating folder: OpenStack. Parent ref: group-v4. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.651063] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3364fea0-b323-4ae0-a8a7-f15757b3d2b6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.666951] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Created folder: OpenStack in parent group-v4. [ 571.666951] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating folder: Project (e42a912e823e4509bceb398a9adcdab5). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.667107] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b626452b-ea66-4a21-bdea-8daa18e5d65a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.674020] env[59490]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 571.674199] env[59490]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=59490) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}} [ 571.674501] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Folder already exists: OpenStack. Parent ref: group-v4. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 571.674676] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating folder: Project (a5abcabf9b1f440e8768fdcf94ac7b2c). Parent ref: group-v168905. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.674891] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c6600509-c445-444e-9a0d-934a6134cc36 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.682505] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Created folder: Project (e42a912e823e4509bceb398a9adcdab5) in parent group-v168905. [ 571.682557] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating folder: Instances. Parent ref: group-v168906. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.682767] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e87cb004-2613-4eec-b895-acbbc05a8d39 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.692167] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Created folder: Project (a5abcabf9b1f440e8768fdcf94ac7b2c) in parent group-v168905. [ 571.692167] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating folder: Instances. Parent ref: group-v168907. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 571.692167] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a1487035-b4ae-44ac-958b-f6415dcb382f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.696963] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Created folder: Instances in parent group-v168906. [ 571.697218] env[59490]: DEBUG oslo.service.loopingcall [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 571.697405] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1568985c-6898-4b06-817e-f0354a903771] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 571.698882] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6720746f-f0aa-4ed1-9baa-b884d092d54b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.711388] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Created folder: Instances in parent group-v168907. 
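Annotation: the folder sequence above is racy by design. Both concurrent spawns try to create the shared OpenStack folder under group-v4; one Folder.CreateFolder call draws a DuplicateName SOAP fault (surfaced by suds as the WARNING about HTTP status 200), and the loser logs "Folder already exists" and carries on. A sketch of that idempotent create, with hypothetical create_folder/find_child helpers standing in for the real vSphere calls:

```python
# Annotator's sketch (not Nova's code) of tolerating the DuplicateName race.
class DuplicateName(Exception):
    """Stand-in for the vSphere DuplicateName fault."""

def create_folder_if_missing(session, parent_ref, name):
    try:
        return session.create_folder(parent_ref, name)   # Folder.CreateFolder
    except DuplicateName:
        # Lost the race: reuse the folder the other worker just created.
        return session.find_child(parent_ref, name)
```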
[ 571.711620] env[59490]: DEBUG oslo.service.loopingcall [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 571.712206] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 571.712594] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e1631315-0190-47ab-887f-bed0f165c824 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 571.727327] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 571.727327] env[59490]: value = "task-707349" [ 571.727327] env[59490]: _type = "Task" [ 571.727327] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 571.734482] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 571.734482] env[59490]: value = "task-707350" [ 571.734482] env[59490]: _type = "Task" [ 571.734482] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 571.745578] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707349, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 571.752705] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707350, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 572.169511] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Successfully created port: bd089361-7789-4e01-8215-fc4c9260dfc1 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 572.242340] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707349, 'name': CreateVM_Task, 'duration_secs': 0.371565} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
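Annotation: CreateVM_Task is asynchronous. The API call returns a task reference (task-707349, task-707350 above) that wait_for_task then polls, logging progress until the task reports success or failure. A generic sketch of such a poll loop (illustrative only; oslo.vmware's real wait_for_task is built on a looping call with its own retry handling):

```python
# Poll a task accessor until it succeeds, fails, or times out. get_task_info
# is a hypothetical callable returning e.g. {'state': 'running', 'progress': 0}.
import time

def wait_for_task(get_task_info, interval=0.5, timeout=300.0):
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = get_task_info()
        if info['state'] == 'success':
            return info.get('result')       # "completed successfully"
        if info['state'] == 'error':
            raise RuntimeError(info.get('error', 'task failed'))
        time.sleep(interval)                # "progress is 0%" -> poll again
    raise TimeoutError('task did not complete within %.0fs' % timeout)
```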
[ 572.249091] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1568985c-6898-4b06-817e-f0354a903771] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.251134] env[59490]: DEBUG oslo_vmware.service [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d12b744a-8def-40ae-889f-59b0087af5bb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.257267] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 572.257454] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 572.258146] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 572.260321] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a98e4e6b-6945-414a-8985-23eed45355ec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.263404] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707350, 'name': CreateVM_Task, 'duration_secs': 0.36343} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 572.265508] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.265508] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 572.272179] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for the task: (returnval){ [ 572.272179] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52562f23-5b55-421f-8e9b-de7af48dd100" [ 572.272179] env[59490]: _type = "Task" [ 572.272179] env[59490]: } to complete.
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 572.283194] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52562f23-5b55-421f-8e9b-de7af48dd100, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 572.787324] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 572.787880] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 572.788812] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 572.792020] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 572.792020] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 572.792020] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 572.792020] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 572.792020] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ed9bbad7-a70c-46cc-8973-3a0d7a69f502 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.793993] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ec5565a3-0b35-4400-974e-489782f83323 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.803453] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for the task: (returnval){ [ 572.803453] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5281ae7b-86ea-876c-5838-478c274eae61" [ 572.803453] env[59490]: _type = "Task" [ 572.803453] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 572.814739] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5281ae7b-86ea-876c-5838-478c274eae61, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 572.816262] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 572.816440] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 572.817239] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fffb496c-8a0b-480a-8cf0-d60f376dcd46 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.827700] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8c56f9bc-7dd1-4364-812a-8c133945dc5a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 572.833451] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for the task: (returnval){ [ 572.833451] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5252f86e-10ec-9536-7355-bd8e78f936fa" [ 572.833451] env[59490]: _type = "Task" [ 572.833451] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 572.843285] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5252f86e-10ec-9536-7355-bd8e78f936fa, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 573.317157] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 573.317157] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 573.317157] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 573.348581] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 573.348581] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating directory with path [datastore2] vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 573.348581] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0cbb7019-c670-4b57-a2eb-3a55ecd13804 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.382329] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Created directory with path [datastore2] vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 573.382667] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Fetch image to [datastore2] vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 573.382990] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] 
vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 573.384423] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc28bbcb-f41a-4ada-8379-7c94130774ac {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.395753] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c089cce7-68b9-4e39-a808-f93ce379a695 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.417222] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7aaf0f24-667e-4a9f-9573-cd2c24bb631e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.460628] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce8d2b24-ecf9-439a-ad5f-4b29b56bca3e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.470075] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8c1d6f6c-a22a-447f-b896-63d07d8b2747 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 573.555434] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 573.623226] env[59490]: DEBUG oslo_vmware.rw_handles [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 573.693911] env[59490]: DEBUG oslo_vmware.rw_handles [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 573.693911] env[59490]: DEBUG oslo_vmware.rw_handles [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 574.661930] env[59490]: ERROR nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. [ 574.661930] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 574.661930] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 574.661930] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 574.661930] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 574.661930] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 574.661930] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 574.661930] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 574.661930] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 574.661930] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 574.661930] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 574.661930] env[59490]: ERROR nova.compute.manager raise self.value [ 574.661930] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 574.661930] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 574.661930] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 574.661930] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 574.662548] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 574.662548] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 574.662548] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. 
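The traceback above bottoms out in nova/network/neutron.py:294, _ensure_no_port_binding_failure, which raises PortBindingFailed whenever Neutron hands back a port whose binding attempt failed. The following is a minimal, self-contained sketch of that check; the exception class and the 'binding_failed' value are stand-ins for nova.exception.PortBindingFailed and nova.network.model.VIF_TYPE_BINDING_FAILED, so treat it as an illustration of the failure mode, not nova's actual module.

# Sketch of the port-binding check the tracebacks in this log end in.
# Stand-ins replace nova's exception class and VIF-type constant.
VIF_TYPE_BINDING_FAILED = 'binding_failed'  # value Neutron reports on a failed binding


class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            "logs for more information.")


def _ensure_no_port_binding_failure(port):
    """Raise if Neutron reported a failed binding for this port."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])


# A port dict shaped like Neutron's answer when no mechanism driver
# could bind the port to the requesting compute host:
port = {'id': 'bd089361-7789-4e01-8215-fc4c9260dfc1',
        'binding:vif_type': VIF_TYPE_BINDING_FAILED}
try:
    _ensure_no_port_binding_failure(port)
except PortBindingFailed as exc:
    print(exc)  # same message the ERROR records above carry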
[ 574.662548] env[59490]: ERROR nova.compute.manager [ 574.662548] env[59490]: Traceback (most recent call last): [ 574.662548] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 574.662548] env[59490]: listener.cb(fileno) [ 574.662548] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 574.662548] env[59490]: result = function(*args, **kwargs) [ 574.662548] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 574.662548] env[59490]: return func(*args, **kwargs) [ 574.662548] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 574.662548] env[59490]: raise e [ 574.662548] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 574.662548] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 574.662548] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 574.662548] env[59490]: created_port_ids = self._update_ports_for_instance( [ 574.662548] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 574.662548] env[59490]: with excutils.save_and_reraise_exception(): [ 574.662548] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 574.662548] env[59490]: self.force_reraise() [ 574.662548] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 574.662548] env[59490]: raise self.value [ 574.662548] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 574.662548] env[59490]: updated_port = self._update_port( [ 574.662548] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 574.662548] env[59490]: _ensure_no_port_binding_failure(port) [ 574.662548] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 574.662548] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 574.663770] env[59490]: nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. [ 574.663770] env[59490]: Removing descriptor: 12 [ 574.664846] env[59490]: ERROR nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. 
[ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Traceback (most recent call last): [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] yield resources [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.driver.spawn(context, instance, image_meta, [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] vm_ref = self.build_virtual_machine(instance, [ 574.664846] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] vif_infos = vmwarevif.get_vif_info(self._session, [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] for vif in network_info: [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self._sync_wrapper(fn, *args, **kwargs) [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.wait() [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self[:] = self._gt.wait() [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self._exit_event.wait() [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 574.665189] env[59490]: ERROR nova.compute.manager 
[instance: 099ea64f-e4db-467c-a906-6f206c469ea5] result = hub.switch() [ 574.665189] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self.greenlet.switch() [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] result = function(*args, **kwargs) [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return func(*args, **kwargs) [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise e [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] nwinfo = self.network_api.allocate_for_instance( [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] created_port_ids = self._update_ports_for_instance( [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 574.665549] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] with excutils.save_and_reraise_exception(): [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.force_reraise() [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise self.value [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] updated_port = self._update_port( [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 
099ea64f-e4db-467c-a906-6f206c469ea5] _ensure_no_port_binding_failure(port) [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise exception.PortBindingFailed(port_id=port['id']) [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. [ 574.665902] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] [ 574.666200] env[59490]: INFO nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Terminating instance [ 574.668655] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 574.669011] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquired lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 574.669238] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 574.711034] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquiring lock "7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.711034] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.716314] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 574.720506] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 574.779160] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 574.779160] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 574.780558] env[59490]: INFO nova.compute.claims [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 574.911876] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56cfb1e2-1a0e-4074-82c6-0cb5375f3e18 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.919969] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4fdbaba-3996-42ad-a2a7-88f699f70828 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.952125] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 574.953823] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecaae171-702a-4597-a559-3684ed37a2ea {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.964293] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c9b4fa2a-bf65-4f67-af06-31dfa18730dc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.968230] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Releasing lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 574.968615] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 
tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 574.968809] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 574.969451] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-69e72682-09cd-42b9-b4ee-ca6d6bae6615 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.980942] env[59490]: DEBUG nova.compute.provider_tree [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 574.985730] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ce0c457-8f7e-42f3-8419-065278780340 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 574.997332] env[59490]: DEBUG nova.scheduler.client.report [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.012797] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 099ea64f-e4db-467c-a906-6f206c469ea5 could not be found. [ 575.013120] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 575.013724] env[59490]: INFO nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Took 0.04 seconds to destroy the instance on the hypervisor. 
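The destroy sequence just logged (the WARNING "Instance does not exist on backend" followed immediately by "Instance destroyed") shows cleanup tolerating a VM that never materialized: the backend lookup raises InstanceNotFound and destroy is treated as an already-done no-op. A simplified sketch of that pattern; FakeBackend and find_vm_by_uuid are hypothetical names, not nova's real API.

class InstanceNotFound(Exception):
    pass


class FakeBackend:
    """Stand-in for the vCenter session; it knows no VMs at all."""
    def find_vm_by_uuid(self, uuid):
        raise InstanceNotFound(f"Instance {uuid} could not be found.")


def destroy(backend, uuid, log=print):
    try:
        vm_ref = backend.find_vm_by_uuid(uuid)
    except InstanceNotFound as exc:
        # Mirrors the WARNING above: nothing exists to tear down, so the
        # instance counts as destroyed and cleanup continues.
        log(f"Instance does not exist on backend: {exc}")
        return
    backend.power_off_and_delete(vm_ref)  # not reached in this sketch


destroy(FakeBackend(), '099ea64f-e4db-467c-a906-6f206c469ea5')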
[ 575.014022] env[59490]: DEBUG oslo.service.loopingcall [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 575.014323] env[59490]: DEBUG nova.compute.manager [-] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 575.014452] env[59490]: DEBUG nova.network.neutron [-] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 575.023211] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.243s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.027451] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 575.034883] env[59490]: DEBUG nova.network.neutron [-] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 575.045343] env[59490]: DEBUG nova.network.neutron [-] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.072413] env[59490]: INFO nova.compute.manager [-] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Took 0.06 seconds to deallocate network for instance. 
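The 'Acquiring lock "compute_resources" ... acquired ... "released" ... held 0.243s' records around this point are emitted by oslo.concurrency's lockutils, which the resource tracker uses to serialize claim bookkeeping. A short example of the same pattern; the lockutils import and decorator are the library's real API, while the two functions are illustrative stand-ins for the resource-tracker methods named in the log.

from oslo_concurrency import lockutils


@lockutils.synchronized('compute_resources')
def instance_claim(instance_uuid):
    # Claim bookkeeping runs under the in-process "compute_resources"
    # lock, so it cannot interleave with an abort on another thread.
    return {'instance': instance_uuid, 'claimed': True}


@lockutils.synchronized('compute_resources')
def abort_instance_claim(instance_uuid):
    # Waits for any in-flight instance_claim to release the lock first;
    # the waited/held timings in the log trace exactly this hand-off.
    return {'instance': instance_uuid, 'claimed': False}


print(instance_claim('7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8'))
print(abort_instance_claim('099ea64f-e4db-467c-a906-6f206c469ea5'))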
[ 575.078104] env[59490]: DEBUG nova.compute.claims [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 575.078258] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.078393] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.088180] env[59490]: DEBUG nova.compute.utils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 575.089580] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 575.089662] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 575.101035] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 575.239477] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 575.252133] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45f79c5d-c542-467c-b475-a01d38030054 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.260504] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ed0425d-4ca5-4b5a-90c3-a055a7653d51 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.297304] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a792147-1d5f-49a9-8acd-abc0f9fb6823 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.305586] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-253a98b2-8ecc-4d2a-bfc7-b7411af8b8a4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.327676] env[59490]: DEBUG nova.compute.provider_tree [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.339211] env[59490]: DEBUG nova.scheduler.client.report [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.365648] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.286s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.365648] env[59490]: ERROR nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. 
[ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Traceback (most recent call last): [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.driver.spawn(context, instance, image_meta, [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 575.365648] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] vm_ref = self.build_virtual_machine(instance, [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] vif_infos = vmwarevif.get_vif_info(self._session, [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] for vif in network_info: [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self._sync_wrapper(fn, *args, **kwargs) [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.wait() [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self[:] = self._gt.wait() [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self._exit_event.wait() [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 575.365925] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] result = hub.switch() [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 575.366316] env[59490]: ERROR 
nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return self.greenlet.switch() [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] result = function(*args, **kwargs) [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] return func(*args, **kwargs) [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise e [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] nwinfo = self.network_api.allocate_for_instance( [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] created_port_ids = self._update_ports_for_instance( [ 575.366316] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] with excutils.save_and_reraise_exception(): [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] self.force_reraise() [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise self.value [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] updated_port = self._update_port( [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] _ensure_no_port_binding_failure(port) [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 575.366611] env[59490]: ERROR 
nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] raise exception.PortBindingFailed(port_id=port['id']) [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] nova.exception.PortBindingFailed: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. [ 575.366611] env[59490]: ERROR nova.compute.manager [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] [ 575.366915] env[59490]: DEBUG nova.compute.utils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 575.374206] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Build of instance 099ea64f-e4db-467c-a906-6f206c469ea5 was re-scheduled: Binding failed for port bd089361-7789-4e01-8215-fc4c9260dfc1, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 575.377825] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 575.377825] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 575.377825] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquired lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 575.377825] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 575.393714] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta 
ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 575.393941] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 575.394103] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 575.394282] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 575.394421] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 575.394558] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 575.395021] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 575.395021] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 575.395126] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 575.395222] env[59490]: DEBUG nova.virt.hardware [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 575.395387] env[59490]: DEBUG nova.virt.hardware [None 
req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 575.396535] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f45eabd1-eda2-432a-8dfa-5d672c6027c7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.406508] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-117eb12f-8b0a-452f-960f-f9265c4d909b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.436995] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.438082] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.450386] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 575.460719] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 575.533111] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 575.533390] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 575.536849] env[59490]: INFO nova.compute.claims [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 575.562429] env[59490]: DEBUG nova.policy [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '54095e15afa741e59a4f2dc71cafb890', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '46a17a267aed4d21993ed0c932c0316e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 575.731845] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3096fb8-083d-4563-8670-42c861f06ef7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.744047] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44145f10-d576-4495-97d5-0ad8b195e189 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.784611] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-391c60d6-d504-46ec-bab4-25b21a693a5c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.793061] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0cd6ce5-3a42-4b3a-85d0-d33f5d9fb1ec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 575.806419] env[59490]: DEBUG nova.compute.provider_tree [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 575.818531] env[59490]: DEBUG nova.scheduler.client.report [None req-9d242ca5-94bf-49be-85fd-0caa82589396 
tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 575.833892] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 575.833892] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 575.885576] env[59490]: DEBUG nova.compute.utils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 575.886885] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 575.891202] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 575.904113] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Start building block device mappings for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 575.914589] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.929827] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Releasing lock "refresh_cache-099ea64f-e4db-467c-a906-6f206c469ea5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 575.929827] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 575.929827] env[59490]: DEBUG nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 575.930085] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 575.974743] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 575.988185] env[59490]: DEBUG nova.network.neutron [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 575.993676] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 575.998992] env[59490]: INFO nova.compute.manager [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 099ea64f-e4db-467c-a906-6f206c469ea5] Took 0.07 seconds to deallocate network for instance. 
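Annotation: the nova.virt.hardware records earlier in this burst (and repeated below for each new build) trace the CPU-topology search: flavor and image express no preference (0:0:0), the per-dimension limits default to 65536, and a one-vCPU guest therefore collapses to the single topology 1:1:1. Below is a minimal, self-contained sketch of that divisor search; the helper names and the sort key are simplified illustrations of _get_possible_cpu_topologies / _get_desirable_cpu_topologies, not Nova's exact code.

from collections import namedtuple

VirtCPUTopology = namedtuple("VirtCPUTopology", "sockets cores threads")

def possible_topologies(vcpus, maximum):
    # Enumerate every sockets*cores*threads factorisation of the vCPU
    # count that stays inside the per-dimension maxima.
    for sockets in range(1, min(vcpus, maximum.sockets) + 1):
        if vcpus % sockets:
            continue
        rest = vcpus // sockets
        for cores in range(1, min(rest, maximum.cores) + 1):
            if rest % cores:
                continue
            threads = rest // cores
            if threads <= maximum.threads:
                yield VirtCPUTopology(sockets, cores, threads)

def desirable_topologies(vcpus, preferred, maximum):
    # Rank candidates by how many *expressed* preferences they violate;
    # a 0 in the preference tuple means "don't care".
    def mismatches(topo):
        return sum(1 for want, got in zip(preferred, topo)
                   if want and want != got)
    return sorted(possible_topologies(vcpus, maximum), key=mismatches)

# One vCPU, no preferences, huge limits -> exactly the result logged:
# [VirtCPUTopology(sockets=1, cores=1, threads=1)]
print(desirable_topologies(1, VirtCPUTopology(0, 0, 0),
                           VirtCPUTopology(65536, 65536, 65536)))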
[ 576.022536] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 576.022875] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 576.022945] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 576.024269] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 576.024443] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 576.024615] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 576.024830] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 576.025369] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 576.025369] env[59490]: DEBUG nova.virt.hardware [None 
req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 576.025369] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 576.025512] env[59490]: DEBUG nova.virt.hardware [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 576.026364] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6778f03b-3453-4709-a47d-362e75bf5815 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.041018] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9dcf478-3dad-45f8-ba72-b5b1217441dd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.130084] env[59490]: INFO nova.scheduler.client.report [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Deleted allocations for instance 099ea64f-e4db-467c-a906-6f206c469ea5 [ 576.171023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-ccfb02db-6599-46fc-ae8e-7e9a251d620a tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "099ea64f-e4db-467c-a906-6f206c469ea5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 7.606s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 576.193019] env[59490]: DEBUG nova.policy [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '755c786e905f4b2eadd0fa0e21f9dc4d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54bb8db5812744c5bb0529c5a674abf8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 576.483061] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquiring lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 576.483061] env[59490]: DEBUG oslo_concurrency.lockutils [None 
req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 576.505612] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 576.570381] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 576.570628] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 576.572166] env[59490]: INFO nova.compute.claims [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 576.749786] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b648dc5a-056f-4864-93d9-ccfdcbc5aebe {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.757047] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdc618e0-56ac-43d4-b74f-26e1da9fd6fa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.789842] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a87b095e-8d6a-49cb-aade-6820aabd1931 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.797862] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2b0e1218-abe8-4357-b674-2b67e364e726 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 576.811568] env[59490]: DEBUG nova.compute.provider_tree [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 576.826118] env[59490]: DEBUG nova.scheduler.client.report [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 
tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 576.844315] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 576.844315] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 576.882267] env[59490]: DEBUG nova.compute.utils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 576.884252] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 576.887572] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 576.900469] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 577.002371] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 577.029540] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 577.029768] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 577.029915] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 577.030107] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 577.030248] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 577.030380] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 577.030583] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 577.030732] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 577.030890] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 577.031209] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 577.031306] env[59490]: DEBUG nova.virt.hardware [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 577.032166] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a534d3d-6a94-4c53-897d-587bd033467d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.040652] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46701f99-3d06-4971-9db5-0f8940faae4f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 577.183743] env[59490]: DEBUG nova.policy [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f0c937cac44c42789931155c4b8e4fbe', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '519fe441329d49bd9535d262789e08a3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 577.425791] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Successfully created port: 27df7c6c-7951-4e58-bd82-049e4cd43ced {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 578.169959] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Successfully created port: 4cba3e78-0e4d-4649-a9a3-cd422d41fe60 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 578.550559] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Successfully created port: 7dff2453-13d9-4d1c-8285-59875beb57a8 {{(pid=59490) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 580.582770] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 580.602673] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Getting list of instances from cluster (obj){ [ 580.602673] env[59490]: value = "domain-c8" [ 580.602673] env[59490]: _type = "ClusterComputeResource" [ 580.602673] env[59490]: } {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 580.603947] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f072310d-1170-4122-a2e3-f461ce4bbaad {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 580.615185] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Got total of 2 instances {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 580.615458] env[59490]: WARNING nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] While synchronizing instance power states, found 5 instances in the database and 2 instances on the hypervisor. [ 580.615589] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid 1568985c-6898-4b06-817e-f0354a903771 {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 580.615772] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid e9f81c59-44ea-4276-a310-7581e3a7abb1 {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 580.615918] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8 {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 580.616073] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid c4085b74-1e83-4d1a-b0d8-963e97f93eff {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 580.616243] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid 39e4603f-1f38-49bd-bbbc-dbfd63961766 {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 580.616535] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "1568985c-6898-4b06-817e-f0354a903771" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.616746] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.616919] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock 
"7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.617109] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.617284] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 580.617692] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 580.618067] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Getting list of instances from cluster (obj){ [ 580.618067] env[59490]: value = "domain-c8" [ 580.618067] env[59490]: _type = "ClusterComputeResource" [ 580.618067] env[59490]: } {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 580.618950] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81ed4968-e32d-4fa5-818f-3f1e17641daa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 580.630417] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Got total of 2 instances {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 581.824683] env[59490]: ERROR nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. 
[ 581.824683] env[59490]: ERROR nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. 
[ 581.824683] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 581.824683] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 581.824683] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 581.824683] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 581.824683] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 581.824683] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 581.824683] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 581.824683] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 581.824683] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 581.824683] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 581.824683] env[59490]: ERROR nova.compute.manager raise self.value [ 581.824683] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 581.824683] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 581.824683] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 581.824683] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 581.825353] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 581.825353] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 581.825353] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. 
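Annotation: the traceback above is the module-level view of the failure (the greenthread-level copy of the same failure follows below) and passes through two reusable patterns worth isolating: _update_port wraps its rollback in oslo's save_and_reraise_exception() (the __exit__ / force_reraise frames), and _ensure_no_port_binding_failure raises PortBindingFailed once Neutron marks the port's binding as failed. A hedged reconstruction follows: save_and_reraise_exception is the real oslo.utils helper, while the exception class and the 'binding_failed' sentinel are simplified stand-ins for nova.exception.PortBindingFailed and the port's binding:vif_type field.

from oslo_utils import excutils

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(f"Binding failed for port {port_id}, please "
                         "check neutron logs for more information.")

def _ensure_no_port_binding_failure(port):
    # Neutron signals a failed binding by setting the port's
    # binding:vif_type to the literal string 'binding_failed'.
    if port.get("binding:vif_type") == "binding_failed":
        raise PortBindingFailed(port["id"])

def update_ports_for_instance(ports):
    for port in ports:
        try:
            _ensure_no_port_binding_failure(port)
        except Exception:
            # Run cleanup, then re-raise the *original* exception with
            # its traceback intact; the force_reraise() / raise
            # self.value frames in the log come from this context
            # manager's __exit__.
            with excutils.save_and_reraise_exception():
                print(f"rolling back port {port['id']}")

try:
    update_ports_for_instance(
        [{"id": "27df7c6c-7951-4e58-bd82-049e4cd43ced",
          "binding:vif_type": "binding_failed"}])
except PortBindingFailed as exc:
    print(exc)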
[ 581.825353] env[59490]: ERROR nova.compute.manager [ 581.825353] env[59490]: Traceback (most recent call last): [ 581.825353] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 581.825353] env[59490]: listener.cb(fileno) [ 581.825353] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 581.825353] env[59490]: result = function(*args, **kwargs) [ 581.825353] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 581.825353] env[59490]: return func(*args, **kwargs) [ 581.825353] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 581.825353] env[59490]: raise e [ 581.825353] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 581.825353] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 581.825353] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 581.825353] env[59490]: created_port_ids = self._update_ports_for_instance( [ 581.825353] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 581.825353] env[59490]: with excutils.save_and_reraise_exception(): [ 581.825353] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 581.825353] env[59490]: self.force_reraise() [ 581.825353] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 581.825353] env[59490]: raise self.value [ 581.825353] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 581.825353] env[59490]: updated_port = self._update_port( [ 581.825353] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 581.825353] env[59490]: _ensure_no_port_binding_failure(port) [ 581.825353] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 581.825353] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 581.827151] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. [ 581.827151] env[59490]: Removing descriptor: 12 [ 581.827151] env[59490]: ERROR nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. 
[ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Traceback (most recent call last): [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] yield resources [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.driver.spawn(context, instance, image_meta, [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 581.827151] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] vm_ref = self.build_virtual_machine(instance, [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] vif_infos = vmwarevif.get_vif_info(self._session, [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] for vif in network_info: [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self._sync_wrapper(fn, *args, **kwargs) [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.wait() [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self[:] = self._gt.wait() [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self._exit_event.wait() [ 581.827447] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 581.827447] env[59490]: ERROR nova.compute.manager 
[instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] result = hub.switch() [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self.greenlet.switch() [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] result = function(*args, **kwargs) [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return func(*args, **kwargs) [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise e [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] nwinfo = self.network_api.allocate_for_instance( [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] created_port_ids = self._update_ports_for_instance( [ 581.827843] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] with excutils.save_and_reraise_exception(): [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.force_reraise() [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise self.value [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] updated_port = self._update_port( [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 
7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] _ensure_no_port_binding_failure(port) [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise exception.PortBindingFailed(port_id=port['id']) [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. [ 581.828163] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] [ 581.828549] env[59490]: INFO nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Terminating instance [ 581.833256] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquiring lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 581.833345] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquired lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 581.833456] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 581.878739] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 582.223850] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 582.241047] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Releasing lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 582.241984] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 582.242195] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 582.242792] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2d1b4e2e-8801-49d5-a771-1559adc44e51 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.258271] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a949bd13-f5b9-4405-ba9a-349790cb2674 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.289725] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8 could not be found. [ 582.290025] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 582.294457] env[59490]: INFO nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Took 0.05 seconds to destroy the instance on the hypervisor. [ 582.294457] env[59490]: DEBUG oslo.service.loopingcall [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 582.294457] env[59490]: DEBUG nova.compute.manager [-] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 582.294457] env[59490]: DEBUG nova.network.neutron [-] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 582.356277] env[59490]: DEBUG nova.network.neutron [-] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 582.366238] env[59490]: DEBUG nova.network.neutron [-] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 582.384322] env[59490]: INFO nova.compute.manager [-] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Took 0.09 seconds to deallocate network for instance. [ 582.386782] env[59490]: DEBUG nova.compute.claims [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 582.386942] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 582.387175] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 582.565884] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ead8f115-00b1-4a6a-8443-3ba04e030f55 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.577144] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5055a9a3-c19e-4659-90da-e26ce40545a3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.615464] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a68ffa6b-9e95-456d-9eec-2cb941d489a9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.623637] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a20d3699-6624-4821-a1a9-cbb0bc2f3cb3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 582.643792] env[59490]: DEBUG nova.compute.provider_tree [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 
tempest-ServerDiagnosticsTest-572015104-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 582.675212] env[59490]: DEBUG nova.scheduler.client.report [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 582.699358] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.312s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 582.699947] env[59490]: ERROR nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. 
[ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Traceback (most recent call last): [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.driver.spawn(context, instance, image_meta, [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] vm_ref = self.build_virtual_machine(instance, [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] vif_infos = vmwarevif.get_vif_info(self._session, [ 582.699947] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] for vif in network_info: [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self._sync_wrapper(fn, *args, **kwargs) [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.wait() [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self[:] = self._gt.wait() [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self._exit_event.wait() [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] result = hub.switch() [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 582.700280] env[59490]: ERROR 
nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return self.greenlet.switch() [ 582.700280] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] result = function(*args, **kwargs) [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] return func(*args, **kwargs) [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise e [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] nwinfo = self.network_api.allocate_for_instance( [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] created_port_ids = self._update_ports_for_instance( [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] with excutils.save_and_reraise_exception(): [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 582.700628] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] self.force_reraise() [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise self.value [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] updated_port = self._update_port( [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] _ensure_no_port_binding_failure(port) [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 582.700941] env[59490]: ERROR 
nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] raise exception.PortBindingFailed(port_id=port['id']) [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] nova.exception.PortBindingFailed: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. [ 582.700941] env[59490]: ERROR nova.compute.manager [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] [ 582.700941] env[59490]: DEBUG nova.compute.utils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 582.704866] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Build of instance 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8 was re-scheduled: Binding failed for port 27df7c6c-7951-4e58-bd82-049e4cd43ced, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 582.704866] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 582.704866] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquiring lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 582.704866] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Acquired lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 582.705059] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 582.803210] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 582.837243] env[59490]: ERROR nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. [ 582.837243] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 582.837243] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 582.837243] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 582.837243] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 582.837243] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 582.837243] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 582.837243] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 582.837243] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 582.837243] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 582.837243] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 582.837243] env[59490]: ERROR nova.compute.manager raise self.value [ 582.837243] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 582.837243] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 582.837243] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 582.837243] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 582.837852] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 582.837852] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 582.837852] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. 
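Every one of these PortBindingFailed tracebacks bottoms out in the same guard in nova/network/neutron.py. A minimal sketch of that check, paraphrased rather than quoted from nova: Neutron reports a failed binding by setting the port's binding:vif_type to 'binding_failed', and nova converts that into the exception seen here so the build can be aborted and rescheduled.

    from nova import exception

    def _ensure_no_port_binding_failure(port):
        # Paraphrased sketch, not verbatim nova code: a port whose binding
        # failed comes back from Neutron with binding:vif_type set to
        # 'binding_failed'; nova raises PortBindingFailed on it.
        if port.get('binding:vif_type') == 'binding_failed':
            raise exception.PortBindingFailed(port_id=port['id'])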
[ 582.837852] env[59490]: ERROR nova.compute.manager [ 582.837852] env[59490]: Traceback (most recent call last): [ 582.837852] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 582.837852] env[59490]: listener.cb(fileno) [ 582.837852] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 582.837852] env[59490]: result = function(*args, **kwargs) [ 582.837852] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 582.837852] env[59490]: return func(*args, **kwargs) [ 582.837852] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 582.837852] env[59490]: raise e [ 582.837852] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 582.837852] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 582.837852] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 582.837852] env[59490]: created_port_ids = self._update_ports_for_instance( [ 582.837852] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 582.837852] env[59490]: with excutils.save_and_reraise_exception(): [ 582.837852] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 582.837852] env[59490]: self.force_reraise() [ 582.837852] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 582.837852] env[59490]: raise self.value [ 582.837852] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 582.837852] env[59490]: updated_port = self._update_port( [ 582.837852] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 582.837852] env[59490]: _ensure_no_port_binding_failure(port) [ 582.837852] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 582.837852] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 582.838639] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. [ 582.838639] env[59490]: Removing descriptor: 16 [ 582.838639] env[59490]: ERROR nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. 
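The spawn traceback that follows fails inside "for vif in network_info" even though the root cause is a Neutron port binding: nova allocates networking on a background greenthread, and the network_info model only waits on that greenthread when it is first consumed, so the worker's exception is re-raised at the point of use. A minimal, runnable sketch of that lazy-wait pattern; the helper names here are illustrative stand-ins, not nova's classes.

    import eventlet

    def allocate_networking():
        # Hypothetical stand-in for nova's _allocate_network_async worker;
        # in the log above this is where PortBindingFailed is raised.
        eventlet.sleep(0.1)
        return []

    gt = eventlet.spawn(allocate_networking)  # allocation starts in background
    # ... other build steps proceed concurrently ...
    network_info = gt.wait()  # first use blocks here; worker exceptions
    print(network_info)       # re-raise at this point, as in the traceback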
[ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Traceback (most recent call last): [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] yield resources [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.driver.spawn(context, instance, image_meta, [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 582.838639] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] vm_ref = self.build_virtual_machine(instance, [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] vif_infos = vmwarevif.get_vif_info(self._session, [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] for vif in network_info: [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self._sync_wrapper(fn, *args, **kwargs) [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.wait() [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self[:] = self._gt.wait() [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self._exit_event.wait() [ 582.839021] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 582.839021] env[59490]: ERROR nova.compute.manager 
[instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] result = hub.switch() [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self.greenlet.switch() [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] result = function(*args, **kwargs) [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return func(*args, **kwargs) [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise e [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] nwinfo = self.network_api.allocate_for_instance( [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] created_port_ids = self._update_ports_for_instance( [ 582.839378] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] with excutils.save_and_reraise_exception(): [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.force_reraise() [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise self.value [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] updated_port = self._update_port( [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: 
c4085b74-1e83-4d1a-b0d8-963e97f93eff] _ensure_no_port_binding_failure(port) [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise exception.PortBindingFailed(port_id=port['id']) [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. [ 582.839765] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] [ 582.840153] env[59490]: INFO nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Terminating instance [ 582.840458] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 582.840600] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquired lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 582.840756] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 582.894159] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.015030] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.025548] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Releasing lock "refresh_cache-7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 583.025835] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 583.026026] env[59490]: DEBUG nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 583.026171] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 583.224056] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.237359] env[59490]: DEBUG nova.network.neutron [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.249255] env[59490]: INFO nova.compute.manager [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] Took 0.22 seconds to deallocate network for instance.
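For scale, the inventory payload the report client logs in this section for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 implies the following schedulable capacity under placement's usual capacity formula, effective = (total - reserved) * allocation_ratio. A quick worked check using only the values from the log:

    # Worked check of the inventory record logged above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        print(rc, (inv['total'] - inv['reserved']) * inv['allocation_ratio'])
    # -> VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0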
[ 583.261495] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.271885] env[59490]: ERROR nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. [ 583.271885] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 583.271885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 583.271885] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 583.271885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 583.271885] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 583.271885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 583.271885] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 583.271885] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 583.271885] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 583.271885] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 583.271885] env[59490]: ERROR nova.compute.manager raise self.value [ 583.271885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 583.271885] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 583.271885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 583.271885] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 583.273828] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 583.273828] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 583.273828] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. 
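The save_and_reraise_exception / force_reraise / "raise self.value" frames that recur in all of these tracebacks are oslo.utils' standard cleanup-then-reraise idiom, not an additional failure. A minimal, runnable usage sketch with hypothetical cleanup code:

    from oslo_utils import excutils

    def update_ports():
        try:
            raise RuntimeError('binding failed')   # stand-in for the real error
        except Exception:
            with excutils.save_and_reraise_exception():
                # Cleanup runs while the original exception is saved; on
                # leaving the block, force_reraise() raises it again, which
                # is why these frames sit mid-traceback in the log.
                print('rolling back partial work')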
[ 583.273828] env[59490]: ERROR nova.compute.manager [ 583.273828] env[59490]: Traceback (most recent call last): [ 583.273828] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 583.273828] env[59490]: listener.cb(fileno) [ 583.273828] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 583.273828] env[59490]: result = function(*args, **kwargs) [ 583.273828] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 583.273828] env[59490]: return func(*args, **kwargs) [ 583.273828] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 583.273828] env[59490]: raise e [ 583.273828] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 583.273828] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 583.273828] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 583.273828] env[59490]: created_port_ids = self._update_ports_for_instance( [ 583.273828] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 583.273828] env[59490]: with excutils.save_and_reraise_exception(): [ 583.273828] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 583.273828] env[59490]: self.force_reraise() [ 583.273828] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 583.273828] env[59490]: raise self.value [ 583.273828] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 583.273828] env[59490]: updated_port = self._update_port( [ 583.273828] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 583.273828] env[59490]: _ensure_no_port_binding_failure(port) [ 583.273828] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 583.273828] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 583.275926] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. [ 583.275926] env[59490]: Removing descriptor: 17 [ 583.275926] env[59490]: ERROR nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. 
[ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Traceback (most recent call last): [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] yield resources [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.driver.spawn(context, instance, image_meta, [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self._vmops.spawn(context, instance, image_meta, injected_files, [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 583.275926] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] vm_ref = self.build_virtual_machine(instance, [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] vif_infos = vmwarevif.get_vif_info(self._session, [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] for vif in network_info: [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self._sync_wrapper(fn, *args, **kwargs) [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.wait() [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self[:] = self._gt.wait() [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self._exit_event.wait() [ 583.276798] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 583.276798] env[59490]: ERROR nova.compute.manager 
[instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] result = hub.switch() [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self.greenlet.switch() [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] result = function(*args, **kwargs) [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return func(*args, **kwargs) [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise e [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] nwinfo = self.network_api.allocate_for_instance( [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] created_port_ids = self._update_ports_for_instance( [ 583.277614] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] with excutils.save_and_reraise_exception(): [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.force_reraise() [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise self.value [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] updated_port = self._update_port( [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 
39e4603f-1f38-49bd-bbbc-dbfd63961766] _ensure_no_port_binding_failure(port) [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise exception.PortBindingFailed(port_id=port['id']) [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. [ 583.278113] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] [ 583.278643] env[59490]: INFO nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Terminating instance [ 583.278643] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Releasing lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 583.278643] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 583.278643] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 583.278643] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquiring lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 583.278844] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquired lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 583.278844] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 583.279702] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e01a48df-bdfc-4020-9063-917155587ae9 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.289864] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e622581-88c0-4be2-b36d-efdcdda436e5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.314385] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c4085b74-1e83-4d1a-b0d8-963e97f93eff could not be found. [ 583.314544] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 583.314720] env[59490]: INFO nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Took 0.04 seconds to destroy the instance on the hypervisor. [ 583.315702] env[59490]: DEBUG oslo.service.loopingcall [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 583.315946] env[59490]: DEBUG nova.compute.manager [-] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 583.316094] env[59490]: DEBUG nova.network.neutron [-] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 583.356650] env[59490]: DEBUG nova.network.neutron [-] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.361532] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.366508] env[59490]: DEBUG nova.network.neutron [-] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.377493] env[59490]: INFO nova.compute.manager [-] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Took 0.06 seconds to deallocate network for instance.
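The "Waiting for function ... _deallocate_network_with_retries to return" lines come from oslo.service's looping-call machinery (loopingcall.py:435), which drives a function repeatedly until it signals completion. A minimal sketch of that mechanism using the module's fixed-interval variant and a hypothetical completion check; nova's retrying deallocation wrapper is built on the same module, so this shows the pattern rather than nova's exact call:

    from oslo_service import loopingcall

    state = {'tries': 0}

    def _poll():
        # Hypothetical completion check standing in for the retry logic.
        state['tries'] += 1
        if state['tries'] >= 3:
            raise loopingcall.LoopingCallDone(retvalue=True)

    timer = loopingcall.FixedIntervalLoopingCall(_poll)
    # start() returns an event; wait() blocks until LoopingCallDone is
    # raised, which is the "Waiting for function ... to return" phase.
    print(timer.start(interval=1).wait())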
[ 583.377919] env[59490]: DEBUG nova.compute.claims [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 583.378428] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.378428] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.412044] env[59490]: INFO nova.scheduler.client.report [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Deleted allocations for instance 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8 [ 583.433239] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b111242b-6a84-4fbe-ab82-e2b67223ba51 tempest-ServerDiagnosticsTest-572015104 tempest-ServerDiagnosticsTest-572015104-project-member] Lock "7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.725s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.434455] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 2.817s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.434455] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8] During sync_power_state the instance has a pending task (spawning). Skip.
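The Acquiring / "acquired ... waited N s" / "released ... held N s" triplets around "compute_resources" and the instance UUID above are oslo.concurrency's named-lock logging: all contenders on the same lock name serialize, and the wait and hold durations are reported on acquire and release. A minimal sketch of the pattern with an illustrative critical section:

    from oslo_concurrency import lockutils

    # Same semantics as the "compute_resources" lines above: any other
    # thread locking this name blocks until release, and lockutils logs
    # the waited/held timings seen in the log.
    with lockutils.lock('compute_resources'):
        print('claim bookkeeping would run here')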
[ 583.434455] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "7cbfc3f4-5bb3-45db-8be0-bbb926ead3f8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.588060] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ecdc3ad-af42-4eff-9a6c-3f92cc079e4f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.598441] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c595947f-0274-4838-9a1c-c66c29c7457f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.632107] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.633781] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f31707d-4552-45ad-8148-116f73743b16 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.641975] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f236817-2919-4254-b4fb-f9ebe73d6f86 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.657880] env[59490]: DEBUG nova.compute.provider_tree [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 583.664136] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Releasing lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 583.664136] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Start destroying the instance on the hypervisor.
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 583.664313] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 583.665337] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b6ea5fa-7038-4692-993c-60c0cc3ba51c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.675455] env[59490]: DEBUG nova.scheduler.client.report [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 583.685036] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e9df683-2577-48e4-837d-63bb97dd46dd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.702658] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.324s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 583.703270] env[59490]: ERROR nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. 
[ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Traceback (most recent call last): [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.driver.spawn(context, instance, image_meta, [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self._vmops.spawn(context, instance, image_meta, injected_files, [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] vm_ref = self.build_virtual_machine(instance, [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] vif_infos = vmwarevif.get_vif_info(self._session, [ 583.703270] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] for vif in network_info: [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self._sync_wrapper(fn, *args, **kwargs) [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.wait() [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self[:] = self._gt.wait() [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self._exit_event.wait() [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] result = hub.switch() [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 583.703637] env[59490]: ERROR 
nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return self.greenlet.switch() [ 583.703637] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] result = function(*args, **kwargs) [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] return func(*args, **kwargs) [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise e [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] nwinfo = self.network_api.allocate_for_instance( [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] created_port_ids = self._update_ports_for_instance( [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] with excutils.save_and_reraise_exception(): [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 583.703990] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] self.force_reraise() [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise self.value [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] updated_port = self._update_port( [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] _ensure_no_port_binding_failure(port) [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 583.704335] env[59490]: ERROR 
nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] raise exception.PortBindingFailed(port_id=port['id']) [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] nova.exception.PortBindingFailed: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. [ 583.704335] env[59490]: ERROR nova.compute.manager [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] [ 583.704572] env[59490]: DEBUG nova.compute.utils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 583.706045] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Build of instance c4085b74-1e83-4d1a-b0d8-963e97f93eff was re-scheduled: Binding failed for port 4cba3e78-0e4d-4649-a9a3-cd422d41fe60, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 583.707026] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 583.707026] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 583.707026] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquired lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 583.707026] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 583.719613] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 39e4603f-1f38-49bd-bbbc-dbfd63961766 could not be found. 
[ 583.719613] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 583.719728] env[59490]: INFO nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Took 0.06 seconds to destroy the instance on the hypervisor. [ 583.719910] env[59490]: DEBUG oslo.service.loopingcall [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 583.721331] env[59490]: DEBUG nova.compute.manager [-] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 583.721331] env[59490]: DEBUG nova.network.neutron [-] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 583.762336] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.766119] env[59490]: DEBUG nova.network.neutron [-] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 583.774066] env[59490]: DEBUG nova.network.neutron [-] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.787411] env[59490]: INFO nova.compute.manager [-] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Took 0.07 seconds to deallocate network for instance. 
[ 583.789337] env[59490]: DEBUG nova.compute.claims [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 583.789496] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 583.789790] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 583.889051] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 583.904732] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Releasing lock "refresh_cache-c4085b74-1e83-4d1a-b0d8-963e97f93eff" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 583.905120] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 583.905385] env[59490]: DEBUG nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 583.905634] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 583.929473] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b733956a-aa7f-47a8-bf7f-6d8102e0be69 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.940198] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b54062d2-c74e-451b-9d5e-40c075f8d88f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.978170] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdb5d051-661f-4715-adb8-10911ba92bce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 583.984403] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eb93131-dcbf-4308-9591-79036d212c3b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 584.000767] env[59490]: DEBUG nova.compute.provider_tree [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 584.017699] env[59490]: DEBUG nova.scheduler.client.report [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 584.025061] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 584.031794] env[59490]: DEBUG nova.network.neutron [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.041951] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.252s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.045018] env[59490]: ERROR nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Traceback (most recent call last): [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.driver.spawn(context, instance, image_meta, [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self._vmops.spawn(context, instance, image_meta, injected_files, [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] vm_ref = self.build_virtual_machine(instance, [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] vif_infos = vmwarevif.get_vif_info(self._session, [ 584.045018] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] for vif in network_info: [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self._sync_wrapper(fn, *args, **kwargs) [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 
39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.wait() [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self[:] = self._gt.wait() [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self._exit_event.wait() [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] result = hub.switch() [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return self.greenlet.switch() [ 584.045516] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] result = function(*args, **kwargs) [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] return func(*args, **kwargs) [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise e [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] nwinfo = self.network_api.allocate_for_instance( [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] created_port_ids = self._update_ports_for_instance( [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] with excutils.save_and_reraise_exception(): [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File 
"/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 584.045870] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] self.force_reraise() [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise self.value [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] updated_port = self._update_port( [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] _ensure_no_port_binding_failure(port) [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] raise exception.PortBindingFailed(port_id=port['id']) [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] nova.exception.PortBindingFailed: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. [ 584.046219] env[59490]: ERROR nova.compute.manager [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] [ 584.046532] env[59490]: DEBUG nova.compute.utils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 584.049178] env[59490]: INFO nova.compute.manager [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] Took 0.14 seconds to deallocate network for instance. [ 584.050070] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Build of instance 39e4603f-1f38-49bd-bbbc-dbfd63961766 was re-scheduled: Binding failed for port 7dff2453-13d9-4d1c-8285-59875beb57a8, please check neutron logs for more information. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 584.050523] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 584.050751] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquiring lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 584.051762] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Acquired lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 584.051865] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 584.111602] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 584.169196] env[59490]: INFO nova.scheduler.client.report [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Deleted allocations for instance c4085b74-1e83-4d1a-b0d8-963e97f93eff [ 584.191988] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9d242ca5-94bf-49be-85fd-0caa82589396 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.755s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.192351] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.575s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.192404] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: c4085b74-1e83-4d1a-b0d8-963e97f93eff] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 584.192522] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "c4085b74-1e83-4d1a-b0d8-963e97f93eff" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.378559] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.389087] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Releasing lock "refresh_cache-39e4603f-1f38-49bd-bbbc-dbfd63961766" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 584.389309] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 584.389476] env[59490]: DEBUG nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 584.389626] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 584.423610] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 584.434329] env[59490]: DEBUG nova.network.neutron [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 584.448510] env[59490]: INFO nova.compute.manager [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] Took 0.06 seconds to deallocate network for instance. 
[ 584.562932] env[59490]: INFO nova.scheduler.client.report [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Deleted allocations for instance 39e4603f-1f38-49bd-bbbc-dbfd63961766 [ 584.590145] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1ff13257-c709-46a6-81d9-31b6e1bcc466 tempest-AttachInterfacesUnderV243Test-452483955 tempest-AttachInterfacesUnderV243Test-452483955-project-member] Lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.107s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 584.590513] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 3.973s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 584.590731] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 39e4603f-1f38-49bd-bbbc-dbfd63961766] During sync_power_state the instance has a pending task (spawning). Skip. [ 584.590789] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "39e4603f-1f38-49bd-bbbc-dbfd63961766" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.417520] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.420257] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.420257] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 585.420257] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 585.435927] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 1568985c-6898-4b06-817e-f0354a903771] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 585.436157] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Skipping network cache update for instance because it is Building. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 585.436268] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 585.436976] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.437749] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.437749] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.437749] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.437889] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.438067] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.438227] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 585.438364] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 585.449428] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.449774] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.449847] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.450098] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 585.451369] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4131824-bedc-4bf4-ba61-21026c4f5717 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.461407] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3577a1d-612d-43ff-a9fd-caf573d70ff4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.479717] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-197ce302-519a-4881-9313-a84f91e8036a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.488161] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bfbf9d1-7249-41d6-928f-d05013dd1794 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.541764] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181648MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 585.541924] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.542133] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.618372] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 1568985c-6898-4b06-817e-f0354a903771 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.618372] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e9f81c59-44ea-4276-a310-7581e3a7abb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 585.618372] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 585.618372] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 585.662858] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-841ddfb0-b621-4890-9901-bae3cbe9a99a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.675446] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68198380-d1f6-4818-b53d-cf821b5852e1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.709647] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57dc9eed-b808-4bf3-82fd-54755f2c5a78 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.717840] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08861809-318c-4f08-8bc5-070b34d87468 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 585.740524] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 585.749407] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 585.780545] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 585.780726] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.239s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 585.933806] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquiring lock "aa569569-2ead-4d30-8416-ea2b3e78c212" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 585.934150] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "aa569569-2ead-4d30-8416-ea2b3e78c212" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 585.974198] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 586.053637] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.053877] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.055417] env[59490]: INFO nova.compute.claims [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 586.282254] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a2ccddb-4b26-4a84-b471-4f013400e242 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.292446] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-04420803-8f38-461b-a61e-4a61daf55b1d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.337364] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b605db49-ab1d-4820-bf65-f7b7bfea5027 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.347884] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82bb2f38-f81f-49bf-9277-1bc55f663cd6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.377360] env[59490]: DEBUG nova.compute.provider_tree [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 586.391302] env[59490]: DEBUG nova.scheduler.client.report [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
586.414792] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 586.415438] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 586.465703] env[59490]: DEBUG nova.compute.utils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 586.468228] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 586.468449] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 586.486974] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 586.606466] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 586.644594] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 586.644594] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 586.644594] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 586.646347] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 586.646622] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 586.646787] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 586.647294] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 586.647294] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 
tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 586.647392] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 586.647542] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 586.647858] env[59490]: DEBUG nova.virt.hardware [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 586.651022] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "67614b5d-b125-4f54-b0e4-4a840a186fe3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.651022] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "67614b5d-b125-4f54-b0e4-4a840a186fe3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.654395] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-073b3d0a-951a-41c4-af06-f9c00d966e5c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.665658] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a13d35c-f5b5-4866-973c-ab224331f667 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.675289] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Starting instance... 
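Annotation: the hardware.py lines above ("Build topologies for 1 vcpu(s) 1:1:1", "Got 1 possible topologies") enumerate every (sockets, cores, threads) split of the vCPU count under the 65536 limits. An illustrative re-creation of that search, simplified from Nova's _get_possible_cpu_topologies:

def possible_cpu_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
    # Every triple whose product exactly covers the vCPU count is a
    # candidate topology; the limits cap each dimension.
    topos = []
    for s in range(1, min(vcpus, max_sockets) + 1):
        for c in range(1, min(vcpus, max_cores) + 1):
            for t in range(1, min(vcpus, max_threads) + 1):
                if s * c * t == vcpus:
                    topos.append((s, c, t))
    return topos

print(possible_cpu_topologies(1))  # [(1, 1, 1)] -> "Got 1 possible topologies"

For a 1-vCPU m1.nano flavor only (1, 1, 1) survives, which is exactly the single VirtCPUTopology the log reports.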
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 586.738810] env[59490]: DEBUG nova.policy [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9109de47200b49b591a45093387de08a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '966ebcf512b34c97b3bef7a3a90ae80c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 586.761527] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 586.761527] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 586.762432] env[59490]: INFO nova.compute.claims [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 586.927290] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-161bfdf2-a79f-4c85-a5eb-598d17d8de32 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.936332] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce54b72b-7bcd-4d51-8ef4-f297461d67f4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.976150] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32f3f981-a413-4b32-9aea-45b2d1c107d0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 586.987459] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fea2c947-d40d-4d0e-a76c-52d82be649d0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.009049] env[59490]: DEBUG nova.compute.provider_tree [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 587.030559] env[59490]: DEBUG nova.scheduler.client.report [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 
tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 587.053281] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 587.054762] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 587.100451] env[59490]: DEBUG nova.compute.utils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 587.105295] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 587.105561] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 587.129454] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 587.242378] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Start spawning the instance on the hypervisor. 
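Annotation: the scheduler report above carries a placement-style inventory record per resource class. A minimal sketch of how effective capacity falls out of those fields, using the exact numbers from the log: usable = (total - reserved) * allocation_ratio, while max_unit caps any single allocation:

inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 80},
}

for rc, inv in inventory.items():
    usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    # e.g. VCPU: (48 - 0) * 4.0 = 192 schedulable vCPUs on this node,
    # but no single instance may claim more than max_unit = 16 of them.
    print('%s: usable=%s, per-instance cap=%s' % (rc, usable, inv['max_unit']))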
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 587.272354] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:20:37Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='6d2c930f-2c63-4f10-8da4-e7b133a88e18',id=22,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1077663474',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 587.272598] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 587.272771] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 587.273019] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 587.273106] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 587.273253] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 587.273487] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 587.273601] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 587.273758] env[59490]: DEBUG 
nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 587.273914] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 587.274096] env[59490]: DEBUG nova.virt.hardware [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 587.274982] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2119385-bd2c-47b4-b589-072320c06c93 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.284667] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3772a58-cb23-4116-9878-8a1d0d1ba17f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 587.323576] env[59490]: DEBUG nova.policy [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e15e3be4fcec42abbb9bc7c416dc6a41', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4a01faa28c34408a9ec5ee2d02785813', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 587.682860] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Successfully created port: f85a54dc-88ec-442e-bba6-b0bd7f51a62f {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 587.932047] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "de330988-9a6c-43ff-a8ab-25bc5c5a7a51" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 587.932047] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "de330988-9a6c-43ff-a8ab-25bc5c5a7a51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=59490) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 587.955897] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 588.017227] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 588.018716] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 588.020513] env[59490]: INFO nova.compute.claims [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 588.166825] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81b7806c-3b31-43e6-ba8e-36b2422f2e0d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.175474] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-127a42fb-a5c9-4262-b5b4-6eb93b129e2e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.210982] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eda55630-17d4-4f6c-b4e3-d29c4188beed {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.220640] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08565137-d8d7-468b-aa60-35305d1d4abc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.235048] env[59490]: DEBUG nova.compute.provider_tree [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 588.244817] env[59490]: DEBUG nova.scheduler.client.report [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 
'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 588.266267] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.247s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 588.266267] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 588.326224] env[59490]: DEBUG nova.compute.utils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 588.330676] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 588.330893] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 588.338606] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 588.437037] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Start spawning the instance on the hypervisor. 
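Annotation: "Using /dev/sd instead of None" above means get_next_device_name fell back to the default /dev/sd prefix because no device name was requested. A simplified, illustrative sketch of that kind of next-free-name selection (Nova's real helper also handles multi-letter suffixes and other bus prefixes):

import string

def next_device_name(existing, prefix='/dev/sd'):
    # Collect the single-letter suffixes already in use under this prefix,
    # then hand out the first free letter.
    used = {name[len(prefix):] for name in existing if name.startswith(prefix)}
    for letter in string.ascii_lowercase:
        if letter not in used:
            return prefix + letter
    raise ValueError('no free device names under %s' % prefix)

print(next_device_name(['/dev/sda']))  # /dev/sdb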
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 588.467176] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 588.467176] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 588.467176] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 588.467438] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 588.467438] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 588.467438] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 588.467527] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 588.467649] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 588.467810] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 588.468022] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 588.468148] env[59490]: DEBUG nova.virt.hardware [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 588.469525] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9742b80f-3dcd-4266-966d-5e49fba622c4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.480030] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-561e3af5-23db-4112-b097-9e46861d2062 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 588.814889] env[59490]: DEBUG nova.policy [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '755c786e905f4b2eadd0fa0e21f9dc4d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '54bb8db5812744c5bb0529c5a674abf8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 588.917778] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Successfully created port: 83a1fd10-6eb0-4aeb-837f-b808b1d579bb {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 590.477683] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Successfully created port: 9f60bb7e-5421-4ff8-8830-8253b023fe57 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 591.998174] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquiring lock "84594817-90c9-4c87-b856-d0340b0d4972" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 591.998174] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "84594817-90c9-4c87-b856-d0340b0d4972" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.012537] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 592.099947] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.099947] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.101870] env[59490]: INFO nova.compute.claims [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.269638] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-904010c1-cbef-441a-8e14-45cbe183f871 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.281665] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f42d7a2a-bd91-4e8c-af6a-c2898378c9df {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.334289] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7313b849-c0a8-4d14-9108-c8872853777f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.342956] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee8e660d-f026-4966-8f15-f531fb639de9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.360655] env[59490]: DEBUG nova.compute.provider_tree [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 592.373935] 
env[59490]: DEBUG nova.scheduler.client.report [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 592.402969] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 592.402969] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 592.447844] env[59490]: DEBUG nova.compute.utils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 592.454839] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 592.454839] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 592.474927] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 592.589925] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Start spawning the instance on the hypervisor. 
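Annotation: each build in this trace logs a failed "network:attach_external_network" policy check, because the tempest credentials carry only reader/member roles. A minimal oslo.policy sketch of that enforcement; the is_admin:True check string mirrors the admin-only default but is an assumption here, not a quote of Nova's shipped policy file:

from oslo_config import cfg
from oslo_policy import policy

enforcer = policy.Enforcer(cfg.CONF)
enforcer.register_default(
    policy.RuleDefault('network:attach_external_network', 'is_admin:True'))

# Credentials shaped like the dict in the log line above (trimmed).
creds = {'is_admin': False, 'roles': ['reader', 'member'],
         'project_id': '966ebcf512b34c97b3bef7a3a90ae80c'}
print(enforcer.enforce('network:attach_external_network', {}, creds))  # False

A False result here is not an error: Nova probes this rule to decide whether the request may attach external networks, and simply proceeds without that privilege.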
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 592.634045] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "31c074c3-93cf-4f48-b003-253fc5405e35" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.634520] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "31c074c3-93cf-4f48-b003-253fc5405e35" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.648421] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 592.648421] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 592.648421] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 592.648646] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 592.648646] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 592.648646] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 
tempest-ServersAdminNegativeTestJSON-420877850-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 592.648646] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 592.648784] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 592.649521] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 592.649521] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 592.649521] env[59490]: DEBUG nova.virt.hardware [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 592.650510] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d85ea3a-92b6-4dfe-bb0f-7adb17cee7c1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.654811] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 592.663229] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-337a8337-9b2e-4150-ad2e-8538f1c2b04b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.724549] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 592.724549] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 592.724705] env[59490]: INFO nova.compute.claims [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 592.802998] env[59490]: DEBUG nova.policy [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77864ebd47674dc6a77ed1fbe7e80d59', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3764d36b782143a6bc022de0caef4e35', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 592.911769] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d73324d-59b2-486e-9685-4f607117d793 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.922565] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9d1f5ba-5672-40d9-8fbe-ff770f80c703 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.962218] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dc90db7-c1ee-4c1c-9713-a86254382855 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.971224] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c2e582e-475a-4ae1-98a4-e1a09f20f1f0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 592.985326] env[59490]: DEBUG nova.compute.provider_tree [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Inventory has not changed in ProviderTree for provider: 
715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 593.000359] env[59490]: DEBUG nova.scheduler.client.report [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 593.020270] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.298s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 593.021013] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 593.069968] env[59490]: DEBUG nova.compute.utils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 593.071283] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Not allocating networking since 'none' was specified. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 593.089877] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 593.182410] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Start spawning the instance on the hypervisor. 
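Annotation: unlike the earlier builds, the ServersAaction247Test instance above logs "Not allocating networking since 'none' was specified", so the Neutron allocate_for_instance() path is skipped entirely. A sketch of that branch (simplified; Nova represents the request as a NetworkRequestList object rather than a plain string):

def allocate_network_async(requested_networks):
    if requested_networks == 'none':
        # Mirrors manager.py:1948 in the log: skip Neutron entirely.
        print("Not allocating networking since 'none' was specified.")
        return []
    # Mirrors manager.py:1952: ports are created in the background.
    print('Allocating IP information in the background.')
    return ['port-for-%s' % net for net in requested_networks]

allocate_network_async('none')         # no ports
allocate_network_async(['private'])    # one port per requested network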
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 593.206061] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 593.206316] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 593.206469] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 593.206637] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 593.206802] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 593.208554] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 593.208704] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 593.208800] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 593.209033] env[59490]: DEBUG nova.virt.hardware [None 
req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 593.209221] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 593.209489] env[59490]: DEBUG nova.virt.hardware [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 593.212700] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-860cee56-edf6-4a7c-8a37-6f32126cc501 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.223168] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-caf67334-5629-43ac-8c64-2da8ad5aaf64 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.241383] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Instance VIF info [] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 593.252280] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Creating folder: Project (c76889d8ac894bd2a1408277d73009cf). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.252280] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-185a8697-3361-44ec-9363-6fd515a024ce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.264380] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Created folder: Project (c76889d8ac894bd2a1408277d73009cf) in parent group-v168905. [ 593.264380] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Creating folder: Instances. Parent ref: group-v168916. 
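Annotation: vm_util above lays the VM out under a per-project folder ("Project (c76889d8...)") and an "Instances" folder inside it, reusing folders that already exist. A sketch of that idempotent two-level layout with an in-memory stand-in for the vCenter folder tree; the helper is illustrative, not Nova's vm_util (vCenter itself raises DuplicateName, which Nova treats as "reuse"):

folders = {}  # parent ref -> {name: child ref}

def create_folder(parent_ref, name):
    children = folders.setdefault(parent_ref, {})
    if name not in children:  # only create when absent; otherwise reuse
        children[name] = '%s/%s' % (parent_ref, name)
    return children[name]

proj = create_folder('group-v168905', 'Project (c76889d8ac894bd2a1408277d73009cf)')
inst = create_folder(proj, 'Instances')
print(inst)  # stable ref whether or not the folders pre-existed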
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 593.264380] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d19e7cfa-cd54-4c1a-83d1-6abb15c67402 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.274397] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Created folder: Instances in parent group-v168916. [ 593.274622] env[59490]: DEBUG oslo.service.loopingcall [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 593.274798] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 593.274997] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-05861e55-ad1c-430e-81ed-0e7536543869 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 593.293450] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 593.293450] env[59490]: value = "task-707363" [ 593.293450] env[59490]: _type = "Task" [ 593.293450] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 593.302344] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707363, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 593.807882] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707363, 'name': CreateVM_Task, 'duration_secs': 0.236907} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 593.808101] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 593.809297] env[59490]: DEBUG oslo_vmware.service [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8f4a410-055c-4981-8ec9-ec3f4a208fb6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 593.823959] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 593.823959] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 593.823959] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 593.823959] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a705ba52-307d-4c97-8d8f-e9227726dfcf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 593.827175] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for the task: (returnval){
[ 593.827175] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]525893be-673e-a845-fd56-da06542f57f6"
[ 593.827175] env[59490]: _type = "Task"
[ 593.827175] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 593.838957] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]525893be-673e-a845-fd56-da06542f57f6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
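The CreateVM_Task and SearchDatastore_Task records above come from oslo.vmware's wait_for_task loop, which repeatedly reads the task object's state until it reaches a terminal value. A minimal sketch of that polling pattern, assuming a hypothetical get_task_info() callable in place of the real PropertyCollector round trip (this is not oslo.vmware's actual implementation):

    import time

    POLL_INTERVAL = 0.5  # oslo.vmware drives its loop from a looping call

    def wait_for_task(get_task_info, task_ref):
        # get_task_info is a hypothetical stand-in for the
        # PropertyCollector.RetrievePropertiesEx round trips in the log;
        # assumed to return an object with state/progress/result/error.
        while True:
            info = get_task_info(task_ref)
            if info.state == 'success':
                return info.result  # e.g. the reference of the created VM
            if info.state == 'error':
                raise RuntimeError(info.error)
            # still queued/running: report and poll again, matching the
            # "progress is 0%." records above
            print("Task %s progress is %s%%" % (task_ref, info.progress))
            time.sleep(POLL_INTERVAL)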
[ 594.182699] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquiring lock "760b4e7a-17ed-45c7-a7df-5698c9a358b6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 594.182967] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "760b4e7a-17ed-45c7-a7df-5698c9a358b6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 594.199029] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 594.272967] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 594.273200] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 594.276666] env[59490]: INFO nova.compute.claims [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 594.348185] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 594.348439] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 594.348659] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd
tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 594.348791] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquired lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 594.348954] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 594.349206] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b58e3de-e002-46e7-a869-71b9be06ba19 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.367623] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 594.367623] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 594.367623] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-469cfaf3-e832-4b4e-9220-acad5820d0f7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.378383] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80bb58de-8123-42bc-a139-940e10cc05a5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.384616] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for the task: (returnval){ [ 594.384616] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52cb018b-e7eb-f47d-1065-943d8cb63c7f" [ 594.384616] env[59490]: _type = "Task" [ 594.384616] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 594.396140] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52cb018b-e7eb-f47d-1065-943d8cb63c7f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 594.523355] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3efc13a9-816d-4960-ba4d-8d298f04ced9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.533098] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e60cc72b-51a3-43d2-8a7d-7048b6593fd9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.572993] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1bdf36bf-a26f-48de-9a34-c7df4d302a8b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.581874] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-815b31cd-41d5-41bd-a67c-2c508e4fe7a9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.597513] env[59490]: DEBUG nova.compute.provider_tree [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 594.606942] env[59490]: DEBUG nova.scheduler.client.report [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 594.623452] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 594.624065] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Start building networks asynchronously for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 594.665526] env[59490]: DEBUG nova.compute.utils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 594.667593] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 594.667730] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 594.675763] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 594.759114] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 594.788518] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:20:39Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1469361381',id=25,is_public=True,memory_mb=128,name='tempest-flavor_with_ephemeral_0-924223850',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 594.788666] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 594.788817] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 594.788995] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 594.789163] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 594.789305] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 594.789711] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 594.789711] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 
tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 594.789849] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 594.789991] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 594.790191] env[59490]: DEBUG nova.virt.hardware [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 594.791132] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2f7487c-9fbc-4caf-a23d-64c805022dc4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.799991] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3983a4a3-7036-4957-b56a-7f068791fe42 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.895428] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 594.895693] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Creating directory with path [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 594.895914] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-019a970c-6645-4c0c-823b-f3020c43eba8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 594.901074] env[59490]: DEBUG nova.policy [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5fb2c8dcfa9c452cbf58afeb58eb5c24', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a5920c27a9ab43018f29e2dac9df2023', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize 
/opt/stack/nova/nova/policy.py:203}}
[ 594.936791] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Created directory with path [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 594.936791] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Fetch image to [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 594.936791] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore1 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 594.937850] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb435561-272b-4222-8b3f-08564c6de459 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 594.947546] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce2b8791-b490-4520-a3ca-3f43503fefb2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 594.960459] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d210147-60e2-4c3b-9b8b-deff3b9f7dcb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 594.999028] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4eb6c4dc-0803-41ef-8763-55992f8efdfc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.005431] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-79bcee22-05f5-4e68-b177-4c9f3be03c41 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 595.095349] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore1 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 595.196880] env[59490]: ERROR nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
[ 595.196880] env[59490]: ERROR nova.compute.manager Traceback (most recent call last):
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 595.196880] env[59490]: ERROR nova.compute.manager     nwinfo = self.network_api.allocate_for_instance(
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 595.196880] env[59490]: ERROR nova.compute.manager     created_port_ids = self._update_ports_for_instance(
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 595.196880] env[59490]: ERROR nova.compute.manager     with excutils.save_and_reraise_exception():
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 595.196880] env[59490]: ERROR nova.compute.manager     self.force_reraise()
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 595.196880] env[59490]: ERROR nova.compute.manager     raise self.value
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 595.196880] env[59490]: ERROR nova.compute.manager     updated_port = self._update_port(
[ 595.196880] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 595.196880] env[59490]: ERROR nova.compute.manager     _ensure_no_port_binding_failure(port)
[ 595.198750] env[59490]: ERROR nova.compute.manager   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 595.198750] env[59490]: ERROR nova.compute.manager     raise exception.PortBindingFailed(port_id=port['id'])
[ 595.198750] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
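The __exit__, force_reraise() and raise self.value frames in this traceback come from oslo.utils' save_and_reraise_exception context manager, which Nova uses inside an except block to run cleanup and then re-raise the original error. A minimal sketch of that pattern against the real oslo_utils API; the injected neutron_update and cleanup callables are hypothetical stand-ins, not Nova's code:

    from oslo_utils import excutils

    def update_port(neutron_update, cleanup, port_id):
        try:
            return neutron_update(port_id)  # the call that may raise
        except Exception:
            with excutils.save_and_reraise_exception() as ctxt:
                # cleanup runs first; when the with-block exits, the saved
                # exception is re-raised via force_reraise(), producing the
                # __exit__/force_reraise frames seen in the traceback above
                cleanup(port_id)
                # setting ctxt.reraise = False would suppress the re-raise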
[ 595.198750] env[59490]: ERROR nova.compute.manager
[ 595.198750] env[59490]: Traceback (most recent call last):
[ 595.198750] env[59490]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
[ 595.198750] env[59490]:     listener.cb(fileno)
[ 595.198750] env[59490]:   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 595.198750] env[59490]:     result = function(*args, **kwargs)
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 595.198750] env[59490]:     return func(*args, **kwargs)
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 595.198750] env[59490]:     raise e
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 595.198750] env[59490]:     nwinfo = self.network_api.allocate_for_instance(
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 595.198750] env[59490]:     created_port_ids = self._update_ports_for_instance(
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 595.198750] env[59490]:     with excutils.save_and_reraise_exception():
[ 595.198750] env[59490]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 595.198750] env[59490]:     self.force_reraise()
[ 595.198750] env[59490]:   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 595.198750] env[59490]:     raise self.value
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 595.198750] env[59490]:     updated_port = self._update_port(
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 595.198750] env[59490]:     _ensure_no_port_binding_failure(port)
[ 595.198750] env[59490]:   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 595.198750] env[59490]:     raise exception.PortBindingFailed(port_id=port['id'])
[ 595.201670] env[59490]: nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
[ 595.201670] env[59490]: Removing descriptor: 17
[ 595.201670] env[59490]: ERROR nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
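The guard that raises PortBindingFailed (neutron.py:294 in the traceback) is essentially a check on the binding:vif_type the Neutron API reports for the port: when no mechanism driver can bind it, Neutron returns the literal vif_type 'binding_failed'. A simplified sketch of that check, not Nova's exact code:

    class PortBindingFailed(Exception):
        """Stand-in for nova.exception.PortBindingFailed."""

    def ensure_no_port_binding_failure(port):
        # port is the dict returned by the Neutron ports API
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(
                "Binding failed for port %s, please check neutron logs "
                "for more information." % port['id'])

On a failure like this one, inspecting the port's binding:vif_type and binding:host_id via the Neutron API is usually the quickest way to see why binding failed, which is what the "please check neutron logs" hint is pointing at.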
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Traceback (most recent call last):
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     yield resources
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.driver.spawn(context, instance, image_meta,
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 595.201670] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     vm_ref = self.build_virtual_machine(instance,
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     for vif in network_info:
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self._sync_wrapper(fn, *args, **kwargs)
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.wait()
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self[:] = self._gt.wait()
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self._exit_event.wait()
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 595.206745] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     result = hub.switch()
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self.greenlet.switch()
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     result = function(*args, **kwargs)
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return func(*args, **kwargs)
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise e
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     nwinfo = self.network_api.allocate_for_instance(
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     created_port_ids = self._update_ports_for_instance(
[ 595.208338] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     with excutils.save_and_reraise_exception():
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.force_reraise()
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise self.value
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     updated_port = self._update_port(
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     _ensure_no_port_binding_failure(port)
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise exception.PortBindingFailed(port_id=port['id'])
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
[ 595.210113] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]
[ 595.213485] env[59490]: INFO nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Terminating instance
[ 595.213485] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquiring lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 595.213485] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquired lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 595.213485] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 595.214936] env[59490]: DEBUG oslo_vmware.rw_handles [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 595.275220] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Successfully created port: aefddd86-8f39-4948-b60a-006a3bad73a6 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 595.282361] env[59490]: DEBUG oslo_vmware.rw_handles [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Completed reading data from the image iterator.
{{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 595.282543] env[59490]: DEBUG oslo_vmware.rw_handles [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 595.394800] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 596.105458] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Successfully created port: 433824ad-c7d3-4772-a636-c2ff338e1c5e {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 596.230184] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 596.243848] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Releasing lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 596.244275] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 596.244448] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 596.244962] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bdc4dc28-5944-4697-8a35-2aa942b11a67 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.253980] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d662dfe-327a-43cd-8e42-98a0db48a967 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.278824] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance aa569569-2ead-4d30-8416-ea2b3e78c212 could not be found. [ 596.279053] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 596.280886] env[59490]: INFO nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Took 0.03 seconds to destroy the instance on the hypervisor. [ 596.280886] env[59490]: DEBUG oslo.service.loopingcall [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 596.280886] env[59490]: DEBUG nova.compute.manager [-] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 596.280886] env[59490]: DEBUG nova.network.neutron [-] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 596.375313] env[59490]: DEBUG nova.network.neutron [-] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 596.388039] env[59490]: DEBUG nova.network.neutron [-] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 596.397789] env[59490]: INFO nova.compute.manager [-] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Took 0.12 seconds to deallocate network for instance. [ 596.400765] env[59490]: DEBUG nova.compute.claims [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 596.400936] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 596.401159] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 596.562255] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1994861-6bc0-4f0e-a216-80c525788d6f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.569948] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24313910-1248-487a-899e-0fd6b88b9786 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.601459] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ae34c6d-d4d7-4b7d-9d39-a03c2e0c0104 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.609038] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35d42d8b-71fd-4da4-bef7-ae360e03ad36 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 596.623359] env[59490]: DEBUG nova.compute.provider_tree [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 596.636854] env[59490]: DEBUG nova.scheduler.client.report [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': 
{'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 596.656981] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.256s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 596.657623] env[59490]: ERROR nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Traceback (most recent call last):
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.driver.spawn(context, instance, image_meta,
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     vm_ref = self.build_virtual_machine(instance,
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     vif_infos = vmwarevif.get_vif_info(self._session,
[ 596.657623] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     for vif in network_info:
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self._sync_wrapper(fn, *args, **kwargs)
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.wait()
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/model.py", line 635, in wait
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self[:] = self._gt.wait()
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self._exit_event.wait()
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     result = hub.switch()
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return self.greenlet.switch()
[ 596.658027] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     result = function(*args, **kwargs)
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     return func(*args, **kwargs)
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise e
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     nwinfo = self.network_api.allocate_for_instance(
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     created_port_ids = self._update_ports_for_instance(
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     with excutils.save_and_reraise_exception():
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 596.658442] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     self.force_reraise()
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise self.value
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     updated_port = self._update_port(
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     _ensure_no_port_binding_failure(port)
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]   File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]     raise exception.PortBindingFailed(port_id=port['id'])
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] nova.exception.PortBindingFailed: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information.
[ 596.658804] env[59490]: ERROR nova.compute.manager [instance: aa569569-2ead-4d30-8416-ea2b3e78c212]
[ 596.659101] env[59490]: DEBUG nova.compute.utils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 596.659818] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Build of instance aa569569-2ead-4d30-8416-ea2b3e78c212 was re-scheduled: Binding failed for port f85a54dc-88ec-442e-bba6-b0bd7f51a62f, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 596.660225] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 596.660439] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquiring lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 596.660575] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Acquired lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 596.660723] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 596.741265] env[59490]: ERROR nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information.
[ 596.741265] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 596.741265] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 596.741265] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 596.741265] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 596.741265] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 596.741265] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 596.741265] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 596.741265] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 596.741265] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 596.741265] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 596.741265] env[59490]: ERROR nova.compute.manager raise self.value [ 596.741265] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 596.741265] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 596.741265] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 596.741265] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 596.741693] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 596.741693] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 596.741693] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. 
[ 596.741693] env[59490]: ERROR nova.compute.manager [ 596.741693] env[59490]: Traceback (most recent call last): [ 596.741693] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 596.741693] env[59490]: listener.cb(fileno) [ 596.741693] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 596.741693] env[59490]: result = function(*args, **kwargs) [ 596.741693] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 596.741693] env[59490]: return func(*args, **kwargs) [ 596.741693] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 596.741693] env[59490]: raise e [ 596.741693] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 596.741693] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 596.741693] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 596.741693] env[59490]: created_port_ids = self._update_ports_for_instance( [ 596.741693] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 596.741693] env[59490]: with excutils.save_and_reraise_exception(): [ 596.741693] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 596.741693] env[59490]: self.force_reraise() [ 596.741693] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 596.741693] env[59490]: raise self.value [ 596.741693] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 596.741693] env[59490]: updated_port = self._update_port( [ 596.741693] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 596.741693] env[59490]: _ensure_no_port_binding_failure(port) [ 596.741693] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 596.741693] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 596.742398] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. [ 596.742398] env[59490]: Removing descriptor: 16 [ 596.742398] env[59490]: ERROR nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. 
[ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Traceback (most recent call last): [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] yield resources [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.driver.spawn(context, instance, image_meta, [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 596.742398] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] vm_ref = self.build_virtual_machine(instance, [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] vif_infos = vmwarevif.get_vif_info(self._session, [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] for vif in network_info: [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self._sync_wrapper(fn, *args, **kwargs) [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.wait() [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self[:] = self._gt.wait() [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self._exit_event.wait() [ 596.742785] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 596.742785] env[59490]: ERROR nova.compute.manager 
[instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] result = hub.switch() [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self.greenlet.switch() [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] result = function(*args, **kwargs) [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return func(*args, **kwargs) [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise e [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] nwinfo = self.network_api.allocate_for_instance( [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] created_port_ids = self._update_ports_for_instance( [ 596.743136] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] with excutils.save_and_reraise_exception(): [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.force_reraise() [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise self.value [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] updated_port = self._update_port( [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 
67614b5d-b125-4f54-b0e4-4a840a186fe3] _ensure_no_port_binding_failure(port) [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise exception.PortBindingFailed(port_id=port['id']) [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. [ 596.743454] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] [ 596.743779] env[59490]: INFO nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Terminating instance [ 596.746546] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 596.746546] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquired lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 596.746546] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 596.755524] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 596.850681] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 597.416571] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.430522] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Releasing lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.431645] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 597.431645] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 597.432060] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d1154b7b-73f6-41fd-99d1-1b67a2f0fe20 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.446990] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5183dc2c-79bd-4650-80bd-e6fbc0904181 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.476383] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.493654] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 67614b5d-b125-4f54-b0e4-4a840a186fe3 could not be found. [ 597.493654] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 597.493654] env[59490]: INFO nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Took 0.06 seconds to destroy the instance on the hypervisor. 
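Every failure in this run bottoms out in the same frame: nova/network/neutron.py:294, where Nova inspects the port Neutron handed back and raises PortBindingFailed. A minimal sketch of that check (assuming the standard Neutron port fields; a stand-in illustration, not Nova's verbatim source):

    # Sketch of the check behind the tracebacks above. Field names follow
    # the Neutron port API; the exception class here is a stand-in.
    VIF_TYPE_BINDING_FAILED = 'binding_failed'

    class PortBindingFailed(Exception):
        def __init__(self, port_id):
            super().__init__(
                'Binding failed for port %s, please check neutron logs '
                'for more information.' % port_id)

    def ensure_no_port_binding_failure(port):
        # Neutron reports a failed binding by setting binding:vif_type to
        # 'binding_failed' on the port; Nova turns that into an exception
        # so the build is aborted and re-scheduled, as seen in the log.
        if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
            raise PortBindingFailed(port_id=port['id'])

Because this check fails during port update, before any VMware resources are provisioned, the later destroy path finds nothing on the backend.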
[ 597.493654] env[59490]: DEBUG oslo.service.loopingcall [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 597.493654] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Releasing lock "refresh_cache-aa569569-2ead-4d30-8416-ea2b3e78c212" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 597.493909] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 597.493986] env[59490]: DEBUG nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 597.495017] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 597.496276] env[59490]: DEBUG nova.compute.manager [-] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 597.496402] env[59490]: DEBUG nova.network.neutron [-] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 597.555030] env[59490]: DEBUG nova.network.neutron [-] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 597.560637] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Instance cache missing network info.
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 597.569408] env[59490]: DEBUG nova.network.neutron [-] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.574440] env[59490]: DEBUG nova.network.neutron [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 597.583427] env[59490]: INFO nova.compute.manager [-] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Took 0.09 seconds to deallocate network for instance. [ 597.585439] env[59490]: DEBUG nova.compute.claims [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 597.585657] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 597.585797] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 597.590850] env[59490]: INFO nova.compute.manager [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] [instance: aa569569-2ead-4d30-8416-ea2b3e78c212] Took 0.09 seconds to deallocate network for instance. 
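The lock bookkeeping around the claim abort above comes from oslo.concurrency: the waited/held timings (lockutils.py:404/409/423) are logged by the synchronized decorator's inner wrapper, while the plainer Acquiring/Acquired/Releasing lines around the refresh_cache locks (lockutils.py:312/315/333) come from the lock() helper. A minimal sketch of both forms, with an illustrative lock name and body:

    from oslo_concurrency import lockutils

    # Decorator form: emits "Acquiring lock ... by ...", then
    # "acquired ... waited N s" and "released ... held N s".
    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance_uuid):
        pass  # runs with the "compute_resources" lock held

    # Context-manager form: emits the Acquiring/Acquired/Releasing
    # lines seen around the per-instance "refresh_cache-..." locks.
    def refresh_cache(instance_uuid):
        with lockutils.lock('refresh_cache-%s' % instance_uuid):
            pass  # rebuild the network info cache here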
[ 597.733189] env[59490]: INFO nova.scheduler.client.report [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Deleted allocations for instance aa569569-2ead-4d30-8416-ea2b3e78c212 [ 597.761693] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5cff17c8-9a70-49bb-955f-3b94031885c0 tempest-FloatingIPsAssociationNegativeTestJSON-27375067 tempest-FloatingIPsAssociationNegativeTestJSON-27375067-project-member] Lock "aa569569-2ead-4d30-8416-ea2b3e78c212" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.827s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 597.812560] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-724a97f2-f3fb-44bb-8b8b-2be39e6cd457 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.820583] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-25195481-f88a-4539-9e8f-37c835b118d5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.852079] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c6a7869-2b9e-4f36-8196-4970c740e99e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.859754] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67ab8e76-9923-4325-9f70-803624c7bd5f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 597.873313] env[59490]: DEBUG nova.compute.provider_tree [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 597.884577] env[59490]: DEBUG nova.scheduler.client.report [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 597.903778] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.317s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 597.904451] env[59490]: ERROR nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959
tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Traceback (most recent call last): [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.driver.spawn(context, instance, image_meta, [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] vm_ref = self.build_virtual_machine(instance, [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] vif_infos = vmwarevif.get_vif_info(self._session, [ 597.904451] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] for vif in network_info: [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self._sync_wrapper(fn, *args, **kwargs) [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.wait() [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self[:] = self._gt.wait() [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self._exit_event.wait() [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 
67614b5d-b125-4f54-b0e4-4a840a186fe3] result = hub.switch() [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return self.greenlet.switch() [ 597.905873] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] result = function(*args, **kwargs) [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] return func(*args, **kwargs) [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise e [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] nwinfo = self.network_api.allocate_for_instance( [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] created_port_ids = self._update_ports_for_instance( [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] with excutils.save_and_reraise_exception(): [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 597.906347] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] self.force_reraise() [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise self.value [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] updated_port = self._update_port( [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 
67614b5d-b125-4f54-b0e4-4a840a186fe3] _ensure_no_port_binding_failure(port) [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] raise exception.PortBindingFailed(port_id=port['id']) [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] nova.exception.PortBindingFailed: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. [ 597.906663] env[59490]: ERROR nova.compute.manager [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] [ 597.906663] env[59490]: DEBUG nova.compute.utils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 597.908303] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Build of instance 67614b5d-b125-4f54-b0e4-4a840a186fe3 was re-scheduled: Binding failed for port 83a1fd10-6eb0-4aeb-837f-b808b1d579bb, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 597.908750] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 597.908967] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquiring lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 597.909130] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Acquired lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 597.909274] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 597.960696] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 598.029914] env[59490]: ERROR nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. [ 598.029914] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 598.029914] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 598.029914] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 598.029914] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 598.029914] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 598.029914] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 598.029914] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 598.029914] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 598.029914] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 598.029914] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 598.029914] env[59490]: ERROR nova.compute.manager raise self.value [ 598.029914] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 598.029914] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 598.029914] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 598.029914] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 598.030628] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 598.030628] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 598.030628] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. 
[ 598.030628] env[59490]: ERROR nova.compute.manager [ 598.030628] env[59490]: Traceback (most recent call last): [ 598.030628] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 598.030628] env[59490]: listener.cb(fileno) [ 598.030628] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 598.030628] env[59490]: result = function(*args, **kwargs) [ 598.030628] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 598.030628] env[59490]: return func(*args, **kwargs) [ 598.030628] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 598.030628] env[59490]: raise e [ 598.030628] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 598.030628] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 598.030628] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 598.030628] env[59490]: created_port_ids = self._update_ports_for_instance( [ 598.030628] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 598.030628] env[59490]: with excutils.save_and_reraise_exception(): [ 598.030628] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 598.030628] env[59490]: self.force_reraise() [ 598.030628] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 598.030628] env[59490]: raise self.value [ 598.030628] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 598.030628] env[59490]: updated_port = self._update_port( [ 598.030628] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 598.030628] env[59490]: _ensure_no_port_binding_failure(port) [ 598.030628] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 598.030628] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 598.031468] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. [ 598.031468] env[59490]: Removing descriptor: 12 [ 598.031468] env[59490]: ERROR nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. 
[ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Traceback (most recent call last): [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] yield resources [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.driver.spawn(context, instance, image_meta, [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self._vmops.spawn(context, instance, image_meta, injected_files, [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 598.031468] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] vm_ref = self.build_virtual_machine(instance, [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] vif_infos = vmwarevif.get_vif_info(self._session, [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] for vif in network_info: [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self._sync_wrapper(fn, *args, **kwargs) [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.wait() [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self[:] = self._gt.wait() [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self._exit_event.wait() [ 598.031795] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 598.031795] env[59490]: ERROR nova.compute.manager 
[instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] result = hub.switch() [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self.greenlet.switch() [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] result = function(*args, **kwargs) [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return func(*args, **kwargs) [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise e [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] nwinfo = self.network_api.allocate_for_instance( [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] created_port_ids = self._update_ports_for_instance( [ 598.032208] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] with excutils.save_and_reraise_exception(): [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.force_reraise() [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise self.value [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] updated_port = self._update_port( [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: 
de330988-9a6c-43ff-a8ab-25bc5c5a7a51] _ensure_no_port_binding_failure(port) [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise exception.PortBindingFailed(port_id=port['id']) [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. [ 598.032605] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] [ 598.032960] env[59490]: INFO nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Terminating instance [ 598.036564] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.036723] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquired lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.036877] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 598.117773] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 598.190029] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.213534] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Releasing lock "refresh_cache-67614b5d-b125-4f54-b0e4-4a840a186fe3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 598.213755] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 598.213909] env[59490]: DEBUG nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 598.214077] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 598.293944] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 598.307566] env[59490]: DEBUG nova.network.neutron [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.318023] env[59490]: INFO nova.compute.manager [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] [instance: 67614b5d-b125-4f54-b0e4-4a840a186fe3] Took 0.10 seconds to deallocate network for instance. 
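The traceback above bottoms out in the same two frames every PortBindingFailed in this log does: nova/network/neutron.py:585 calls _ensure_no_port_binding_failure(port), and neutron.py:294 raises exception.PortBindingFailed. A minimal self-contained sketch of that check, reconstructed from the frames shown here; the 'binding_failed' vif-type value is an assumption about how Neutron reports the failure on the port object, not something this log shows directly:

    class PortBindingFailed(Exception):
        # message format copied from the log lines above
        def __init__(self, port_id):
            super().__init__(
                "Binding failed for port %s, please check neutron "
                "logs for more information." % port_id)


    def _ensure_no_port_binding_failure(port):
        # Neutron accepts the port create/update and records the
        # failure on the port itself, so Nova has to inspect the
        # returned port rather than catch an error from the API call.
        if port.get('binding:vif_type') == 'binding_failed':
            raise PortBindingFailed(port_id=port['id'])

That ordering is why the build fails only after the port update returns, and why the message points at the Neutron logs for the underlying cause.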
[ 598.410807] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.426074] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Releasing lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 598.426338] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 598.426459] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 598.427788] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-107938b6-b51d-40b1-9437-63c3a999ea7a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.439984] env[59490]: INFO nova.scheduler.client.report [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Deleted allocations for instance 67614b5d-b125-4f54-b0e4-4a840a186fe3 [ 598.451596] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a278543-8ca9-4231-8f9c-142390b22243 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.479849] env[59490]: DEBUG oslo_concurrency.lockutils [None req-464be094-dd79-4453-82eb-27f04e8dd126 tempest-MigrationsAdminTest-1715353959 tempest-MigrationsAdminTest-1715353959-project-member] Lock "67614b5d-b125-4f54-b0e4-4a840a186fe3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.831s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.482464] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance de330988-9a6c-43ff-a8ab-25bc5c5a7a51 could not be found.
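The lockutils lines here ("Acquiring lock ...", "acquired ... :: waited 0.000s", "released ... :: held 11.831s") come from a wrapper that times both the wait for the lock and the time spent holding it. A toy stand-in for that tracing, using plain threading rather than oslo.concurrency's fair locks; the message format mimics the log, the implementation is illustrative only:

    import contextlib
    import threading
    import time

    _locks = {}


    @contextlib.contextmanager
    def traced_lock(name, by):
        # one process-wide lock object per lock name
        lock = _locks.setdefault(name, threading.Lock())
        print('Acquiring lock "%s" by "%s"' % (name, by))
        start = time.monotonic()
        with lock:
            print('Lock "%s" acquired by "%s" :: waited %.3fs'
                  % (name, by, time.monotonic() - start))
            held_from = time.monotonic()
            try:
                yield
            finally:
                print('Lock "%s" "released" by "%s" :: held %.3fs'
                      % (name, by, time.monotonic() - held_from))

The held times are what make these lines useful when reading the log: the 11.831s above is the whole _locked_do_build_and_run_instance critical section for instance 67614b5d, from claim to cleanup.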
[ 598.482464] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 598.482464] env[59490]: INFO nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Took 0.05 seconds to destroy the instance on the hypervisor. [ 598.482464] env[59490]: DEBUG oslo.service.loopingcall [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 598.482464] env[59490]: DEBUG nova.compute.manager [-] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 598.482618] env[59490]: DEBUG nova.network.neutron [-] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 598.530082] env[59490]: DEBUG nova.network.neutron [-] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 598.550888] env[59490]: DEBUG nova.network.neutron [-] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 598.566563] env[59490]: INFO nova.compute.manager [-] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Took 0.08 seconds to deallocate network for instance.
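The loopingcall line above ("Waiting for function ..._deallocate_network_with_retries to return") shows that network cleanup after a failed build runs inside a retrying looping call rather than being attempted once. A hedged sketch of that shape; the attempt count and back-off values here are assumptions, the real policy lives in nova and oslo.service, not in this log:

    import time


    def deallocate_network_with_retries(deallocate, attempts=3,
                                        interval=1.0):
        # Retry cleanup so a transient Neutron failure does not leak
        # ports; on the final attempt the error is re-raised for the
        # caller to log. Values are illustrative, not nova's.
        for attempt in range(1, attempts + 1):
            try:
                return deallocate()
            except Exception:
                if attempt == attempts:
                    raise
                time.sleep(interval)
                interval *= 2

In this run the first attempt succeeds ("Took 0.08 seconds to deallocate network for instance"), so the retry machinery never fires.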
[ 598.568753] env[59490]: DEBUG nova.compute.claims [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 598.568753] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 598.568968] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 598.709750] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-826d893c-ee25-42d4-a5e6-3dc8d9e65ee2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.719650] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd176848-30d6-4a48-9994-028a3148931f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.757858] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d221d238-8424-4158-89c5-6dcaec8023f2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.768773] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3c62348f-fea4-4a74-84e1-658b34590ba0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 598.785788] env[59490]: DEBUG nova.compute.provider_tree [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 598.794397] env[59490]: DEBUG nova.scheduler.client.report [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 598.813661] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 
tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.243s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 598.813661] env[59490]: ERROR nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Traceback (most recent call last): [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.driver.spawn(context, instance, image_meta, [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self._vmops.spawn(context, instance, image_meta, injected_files, [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 598.813661] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] vm_ref = self.build_virtual_machine(instance, [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] vif_infos = vmwarevif.get_vif_info(self._session, [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] for vif in network_info: [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self._sync_wrapper(fn, *args, **kwargs) [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.wait() [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self[:] = self._gt.wait() [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self._exit_event.wait() [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 598.813980] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] result = hub.switch() [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return self.greenlet.switch() [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] result = function(*args, **kwargs) [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] return func(*args, **kwargs) [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise e [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] nwinfo = self.network_api.allocate_for_instance( [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] created_port_ids = self._update_ports_for_instance( [ 598.814375] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] with excutils.save_and_reraise_exception(): [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] self.force_reraise() [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise self.value [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File 
"/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] updated_port = self._update_port( [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] _ensure_no_port_binding_failure(port) [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] raise exception.PortBindingFailed(port_id=port['id']) [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] nova.exception.PortBindingFailed: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. [ 598.814717] env[59490]: ERROR nova.compute.manager [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] [ 598.815224] env[59490]: DEBUG nova.compute.utils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 598.815707] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Build of instance de330988-9a6c-43ff-a8ab-25bc5c5a7a51 was re-scheduled: Binding failed for port 9f60bb7e-5421-4ff8-8830-8253b023fe57, please check neutron logs for more information. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 598.816200] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 598.816484] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquiring lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 598.816666] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Acquired lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 598.817094] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 598.863600] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 599.070158] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.081024] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Releasing lock "refresh_cache-de330988-9a6c-43ff-a8ab-25bc5c5a7a51" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 599.081024] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 599.081024] env[59490]: DEBUG nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 599.081024] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 599.135796] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 599.145565] env[59490]: DEBUG nova.network.neutron [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 599.158734] env[59490]: INFO nova.compute.manager [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] [instance: de330988-9a6c-43ff-a8ab-25bc5c5a7a51] Took 0.08 seconds to deallocate network for instance. [ 599.269969] env[59490]: INFO nova.scheduler.client.report [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Deleted allocations for instance de330988-9a6c-43ff-a8ab-25bc5c5a7a51 [ 599.300689] env[59490]: DEBUG oslo_concurrency.lockutils [None req-73bd5f0d-b9bb-4736-b99b-f1569c8b3b46 tempest-DeleteServersAdminTestJSON-1601077900 tempest-DeleteServersAdminTestJSON-1601077900-project-member] Lock "de330988-9a6c-43ff-a8ab-25bc5c5a7a51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.369s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 600.090251] env[59490]: ERROR nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information.
[ 600.090251] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 600.090251] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 600.090251] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 600.090251] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 600.090251] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 600.090251] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 600.090251] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 600.090251] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 600.090251] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 600.090251] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 600.090251] env[59490]: ERROR nova.compute.manager raise self.value [ 600.090251] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 600.090251] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 600.090251] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 600.090251] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 600.090885] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 600.090885] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 600.090885] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. 
[ 600.090885] env[59490]: ERROR nova.compute.manager [ 600.090885] env[59490]: Traceback (most recent call last): [ 600.090885] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait [ 600.090885] env[59490]: listener.cb(fileno) [ 600.090885] env[59490]: File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 600.090885] env[59490]: result = function(*args, **kwargs) [ 600.090885] env[59490]: File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 600.090885] env[59490]: return func(*args, **kwargs) [ 600.090885] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 600.090885] env[59490]: raise e [ 600.090885] env[59490]: File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 600.090885] env[59490]: nwinfo = self.network_api.allocate_for_instance( [ 600.090885] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 600.090885] env[59490]: created_port_ids = self._update_ports_for_instance( [ 600.090885] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 600.090885] env[59490]: with excutils.save_and_reraise_exception(): [ 600.090885] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 600.090885] env[59490]: self.force_reraise() [ 600.090885] env[59490]: File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 600.090885] env[59490]: raise self.value [ 600.090885] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 600.090885] env[59490]: updated_port = self._update_port( [ 600.090885] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 600.090885] env[59490]: _ensure_no_port_binding_failure(port) [ 600.090885] env[59490]: File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 600.090885] env[59490]: raise exception.PortBindingFailed(port_id=port['id']) [ 600.091594] env[59490]: nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. [ 600.091594] env[59490]: Removing descriptor: 20 [ 600.091594] env[59490]: ERROR nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. 
[ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Traceback (most recent call last): [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] yield resources [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] self.driver.spawn(context, instance, image_meta, [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn [ 600.091594] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] vm_ref = self.build_virtual_machine(instance, [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] vif_infos = vmwarevif.get_vif_info(self._session, [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] for vif in network_info: [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__ [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] return self._sync_wrapper(fn, *args, **kwargs) [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] self.wait() [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/model.py", line 635, in wait [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] self[:] = self._gt.wait() [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] return self._exit_event.wait() [ 600.091966] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 600.091966] env[59490]: ERROR nova.compute.manager 
[instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] result = hub.switch() [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] return self.greenlet.switch() [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] result = function(*args, **kwargs) [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] return func(*args, **kwargs) [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] raise e [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] nwinfo = self.network_api.allocate_for_instance( [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] created_port_ids = self._update_ports_for_instance( [ 600.092335] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] with excutils.save_and_reraise_exception(): [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] self.force_reraise() [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] raise self.value [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] updated_port = self._update_port( [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 
760b4e7a-17ed-45c7-a7df-5698c9a358b6] _ensure_no_port_binding_failure(port) [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] raise exception.PortBindingFailed(port_id=port['id']) [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. [ 600.092670] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] [ 600.093047] env[59490]: INFO nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Terminating instance [ 600.094402] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquiring lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.094670] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquired lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.094858] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 600.139714] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 600.409912] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.421735] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Releasing lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 600.422151] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 600.422339] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 600.422832] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-36b79e37-0d1e-4eff-8a66-a1611a648b48 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.432709] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-006f2d6c-66bd-42f2-8f5e-2ea20d087dea {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 600.457866] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 760b4e7a-17ed-45c7-a7df-5698c9a358b6 could not be found. [ 600.458212] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 600.458408] env[59490]: INFO nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Took 0.04 seconds to destroy the instance on the hypervisor. 
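Every spawn traceback in this log follows the same path: driver.spawn() iterates network_info, model.py's __iter__ goes through _sync_wrapper() and wait(), and self._gt.wait() re-raises whatever the allocation green thread hit. In other words, ports are allocated asynchronously while the rest of the build proceeds, and the result object blocks on first use. A simplified stand-in using plain threads instead of nova's eventlet green threads, with illustrative names:

    from concurrent.futures import ThreadPoolExecutor

    _pool = ThreadPoolExecutor(max_workers=4)


    class AsyncNetworkInfo:
        """Blocks on first use, like nova.network.model's _sync_wrapper."""

        def __init__(self, allocate):
            # kick off the Neutron allocation in the background so the
            # rest of the build can proceed in parallel
            self._future = _pool.submit(allocate)

        def __iter__(self):
            # result() blocks until the worker finishes and re-raises
            # its exception here, which is why PortBindingFailed
            # surfaces under driver.spawn() in the tracebacks above
            return iter(self._future.result())

This also explains the double reporting seen for each failed build: _allocate_network_async logs the error once inside the worker ("Instance failed network setup after 1 attempt(s)"), and the consumer logs it again when the deferred result is first touched inside build_virtual_machine() ("Instance failed to spawn").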
[ 600.458669] env[59490]: DEBUG oslo.service.loopingcall [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 600.458896] env[59490]: DEBUG nova.compute.manager [-] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 600.458988] env[59490]: DEBUG nova.network.neutron [-] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 600.490960] env[59490]: DEBUG nova.compute.manager [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Received event network-changed-433824ad-c7d3-4772-a636-c2ff338e1c5e {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 600.491168] env[59490]: DEBUG nova.compute.manager [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Refreshing instance network info cache due to event network-changed-433824ad-c7d3-4772-a636-c2ff338e1c5e. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 600.491383] env[59490]: DEBUG oslo_concurrency.lockutils [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] Acquiring lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 600.491638] env[59490]: DEBUG oslo_concurrency.lockutils [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] Acquired lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 600.491912] env[59490]: DEBUG nova.network.neutron [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Refreshing network info cache for port 433824ad-c7d3-4772-a636-c2ff338e1c5e {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 600.560321] env[59490]: DEBUG nova.network.neutron [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 600.641307] env[59490]: DEBUG nova.network.neutron [-] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance cache missing network info.
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 600.659366] env[59490]: DEBUG nova.network.neutron [-] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 600.677316] env[59490]: INFO nova.compute.manager [-] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Took 0.22 seconds to deallocate network for instance. [ 600.679432] env[59490]: DEBUG nova.compute.claims [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 600.681897] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 600.682674] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.003s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 600.838087] env[59490]: ERROR nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Instance failed network setup after 1 attempt(s): nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information. 
[ 600.838087] env[59490]: ERROR nova.compute.manager Traceback (most recent call last): [ 600.838087] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async [ 600.838087] env[59490]: ERROR nova.compute.manager nwinfo = self.network_api.allocate_for_instance( [ 600.838087] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance [ 600.838087] env[59490]: ERROR nova.compute.manager created_port_ids = self._update_ports_for_instance( [ 600.838087] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance [ 600.838087] env[59490]: ERROR nova.compute.manager with excutils.save_and_reraise_exception(): [ 600.838087] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 600.838087] env[59490]: ERROR nova.compute.manager self.force_reraise() [ 600.838087] env[59490]: ERROR nova.compute.manager File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 600.838087] env[59490]: ERROR nova.compute.manager raise self.value [ 600.838087] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance [ 600.838087] env[59490]: ERROR nova.compute.manager updated_port = self._update_port( [ 600.838087] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port [ 600.838087] env[59490]: ERROR nova.compute.manager _ensure_no_port_binding_failure(port) [ 600.838620] env[59490]: ERROR nova.compute.manager File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure [ 600.838620] env[59490]: ERROR nova.compute.manager raise exception.PortBindingFailed(port_id=port['id']) [ 600.838620] env[59490]: ERROR nova.compute.manager nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information. 
[ 600.838620] env[59490]: ERROR nova.compute.manager
    Traceback (most recent call last):
      File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/poll.py", line 111, in wait
        listener.cb(fileno)
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
        result = function(*args, **kwargs)
      File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
        return func(*args, **kwargs)
      File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
        raise e
      File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
        nwinfo = self.network_api.allocate_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
        created_port_ids = self._update_ports_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
        with excutils.save_and_reraise_exception():
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
        self.force_reraise()
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
        raise self.value
      File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
        updated_port = self._update_port(
      File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
        _ensure_no_port_binding_failure(port)
      File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
        raise exception.PortBindingFailed(port_id=port['id'])
    nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information.
[ 600.839438] env[59490]: Removing descriptor: 19
[ 600.839438] env[59490]: ERROR nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance failed to spawn: nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information.
[ 600.839438] env[59490]: ERROR nova.compute.manager [instance: 84594817-90c9-4c87-b856-d0340b0d4972]
    Traceback (most recent call last):
      File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
        yield resources
      File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
        self.driver.spawn(context, instance, image_meta,
      File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
        self._vmops.spawn(context, instance, image_meta, injected_files,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
        vm_ref = self.build_virtual_machine(instance,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
        vif_infos = vmwarevif.get_vif_info(self._session,
      File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
        for vif in network_info:
      File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
        return self._sync_wrapper(fn, *args, **kwargs)
      File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
        self.wait()
      File "/opt/stack/nova/nova/network/model.py", line 635, in wait
        self[:] = self._gt.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
        return self._exit_event.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
        result = hub.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
        return self.greenlet.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
        result = function(*args, **kwargs)
      File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
        return func(*args, **kwargs)
      File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
        raise e
      File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
        nwinfo = self.network_api.allocate_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
        created_port_ids = self._update_ports_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
        with excutils.save_and_reraise_exception():
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
        self.force_reraise()
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
        raise self.value
      File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
        updated_port = self._update_port(
      File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
        _ensure_no_port_binding_failure(port)
      File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
        raise exception.PortBindingFailed(port_id=port['id'])
    nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information.
[ 600.842920] env[59490]: INFO nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Terminating instance
[ 600.845031] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd0a1be9-1c13-44a4-8ebe-2a5332f8593c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.848695] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquiring lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 600.848879] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquired lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 600.849079] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 600.859562] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e51d573-6019-4622-97b3-5999a3acedf7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.901396] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a73eff4b-b6cf-4dd9-a72a-cae3c86d34fa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.912023] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a14fce87-ae45-42b0-97c6-bab5c2258c01 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 600.927505] env[59490]: DEBUG nova.compute.provider_tree [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 600.936940] env[59490]: DEBUG nova.scheduler.client.report [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 600.959470] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.275s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
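The "Inventory has not changed" lines mean the report client skipped a Placement update because the proposed inventory matched what the provider tree already cached. A minimal sketch of that comparison (simplified; names here are illustrative, not Nova's actual code):

import copy

def inventory_changed(cached: dict, proposed: dict) -> bool:
    """True if any resource class was added/removed or any field
    (total, reserved, allocation_ratio, ...) differs."""
    if set(cached) != set(proposed):
        return True
    return any(cached[rc] != proposed[rc] for rc in proposed)

cached = {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
                   'step_size': 1, 'allocation_ratio': 4.0}}
assert not inventory_changed(cached, copy.deepcopy(cached))  # "has not changed"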
[ 600.960535] env[59490]: ERROR nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information.
[ 600.960535] env[59490]: ERROR nova.compute.manager [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6]
    Traceback (most recent call last):
      File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
        self.driver.spawn(context, instance, image_meta,
      File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
        self._vmops.spawn(context, instance, image_meta, injected_files,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
        vm_ref = self.build_virtual_machine(instance,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
        vif_infos = vmwarevif.get_vif_info(self._session,
      File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
        for vif in network_info:
      File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
        return self._sync_wrapper(fn, *args, **kwargs)
      File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
        self.wait()
      File "/opt/stack/nova/nova/network/model.py", line 635, in wait
        self[:] = self._gt.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
        return self._exit_event.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
        result = hub.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
        return self.greenlet.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
        result = function(*args, **kwargs)
      File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
        return func(*args, **kwargs)
      File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
        raise e
      File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
        nwinfo = self.network_api.allocate_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
        created_port_ids = self._update_ports_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
        with excutils.save_and_reraise_exception():
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
        self.force_reraise()
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
        raise self.value
      File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
        updated_port = self._update_port(
      File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
        _ensure_no_port_binding_failure(port)
      File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
        raise exception.PortBindingFailed(port_id=port['id'])
    nova.exception.PortBindingFailed: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information.
[ 600.965019] env[59490]: DEBUG nova.compute.utils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 600.967924] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Build of instance 760b4e7a-17ed-45c7-a7df-5698c9a358b6 was re-scheduled: Binding failed for port 433824ad-c7d3-4772-a636-c2ff338e1c5e, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 600.968402] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 600.968596] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquiring lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 600.986646] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
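Every one of these tracebacks bottoms out in the same guard: after asking Neutron to bind the port to this compute host, Nova inspects the returned port and raises PortBindingFailed when the binding came back failed. A rough, self-contained sketch of that check (the real one lives in nova/network/neutron.py; the exception class here is a stand-in):

# 'binding:vif_type' is the Neutron port attribute Nova inspects;
# 'binding_failed' is the value Neutron sets when binding fails.
VIF_TYPE_BINDING_FAILED = 'binding_failed'

class PortBindingFailed(Exception):
    def __init__(self, port_id):
        super().__init__(
            f"Binding failed for port {port_id}, please check neutron "
            f"logs for more information.")

def ensure_no_port_binding_failure(port: dict) -> None:
    """Raise if Neutron reported a failed binding for this port."""
    if port.get('binding:vif_type') == VIF_TYPE_BINDING_FAILED:
        raise PortBindingFailed(port_id=port['id'])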
[ 601.233610] env[59490]: DEBUG nova.network.neutron [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 601.247349] env[59490]: DEBUG oslo_concurrency.lockutils [req-53502655-01b0-4ce7-913f-c9f1d3bd8f5d req-86f60a31-413c-4fd9-900f-353ac2cc2105 service nova] Releasing lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 601.247349] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Acquired lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 601.249388] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 601.330522] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 601.461477] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 601.477802] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Releasing lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 601.477802] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 601.478704] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 601.479297] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6f771cc3-5223-4b78-a4b0-b6fc7923f9c3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.490826] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca715a46-042b-46c4-baa9-a2ac267b3f26 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.522292] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 84594817-90c9-4c87-b856-d0340b0d4972 could not be found.
[ 601.522552] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 601.522731] env[59490]: INFO nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 601.522980] env[59490]: DEBUG oslo.service.loopingcall [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 601.528298] env[59490]: DEBUG nova.compute.manager [-] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 601.528298] env[59490]: DEBUG nova.network.neutron [-] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 601.591988] env[59490]: DEBUG nova.network.neutron [-] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 601.605087] env[59490]: DEBUG nova.network.neutron [-] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 601.616995] env[59490]: INFO nova.compute.manager [-] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Took 0.09 seconds to deallocate network for instance.
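The loopingcall line above shows network deallocation being wrapped in a retrying helper so a transient Neutron failure does not leak ports. A plain-Python sketch of that retry pattern (illustrative stand-in for the oslo.service looping-call machinery; attempt count and delay here are made up):

import time

def call_with_retries(func, attempts=3, delay=1.0):
    """Call func until it succeeds or attempts are exhausted,
    sleeping between tries; re-raises the last failure."""
    for attempt in range(1, attempts + 1):
        try:
            return func()
        except Exception:
            if attempt == attempts:
                raise
            time.sleep(delay)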
[ 601.619784] env[59490]: DEBUG nova.compute.claims [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 601.619959] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 601.620280] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 601.788865] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526433d7-4db9-46b7-834b-f1aa4d91e21b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.799717] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0305cf42-8ecd-413f-975a-14a2f4413607 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.833300] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 601.834853] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26ce94d7-9469-4214-abef-52bde8d228eb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.845255] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9eed8958-fbb9-4db9-a5d3-b1cc5fa46a88 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 601.851392] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Releasing lock "refresh_cache-760b4e7a-17ed-45c7-a7df-5698c9a358b6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 601.851392] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 601.851392] env[59490]: DEBUG nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 601.851392] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 601.863605] env[59490]: DEBUG nova.compute.provider_tree [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 601.873651] env[59490]: DEBUG nova.scheduler.client.report [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 601.892544] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.271s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
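The "waited 0.000s" / "held 0.271s" pairs come from a named-lock wrapper that times both the wait to acquire and the time held. A small sketch of the same idea (illustrative only, not oslo.concurrency's implementation, and the module-level dict is not itself guarded against concurrent first use):

import contextlib
import threading
import time

_locks: dict = {}

@contextlib.contextmanager
def timed_lock(name: str):
    """Named lock that reports wait/hold durations like the
    lockutils lines above."""
    lock = _locks.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    with lock:
        print(f'Lock "{name}" acquired :: waited {time.monotonic() - t0:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            print(f'Lock "{name}" released :: held {time.monotonic() - t1:.3f}s')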
[ 601.892544] env[59490]: ERROR nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Failed to build and run instance: nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information.
[ 601.892544] env[59490]: ERROR nova.compute.manager [instance: 84594817-90c9-4c87-b856-d0340b0d4972]
    Traceback (most recent call last):
      File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
        self.driver.spawn(context, instance, image_meta,
      File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
        self._vmops.spawn(context, instance, image_meta, injected_files,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 753, in spawn
        vm_ref = self.build_virtual_machine(instance,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 275, in build_virtual_machine
        vif_infos = vmwarevif.get_vif_info(self._session,
      File "/opt/stack/nova/nova/virt/vmwareapi/vif.py", line 119, in get_vif_info
        for vif in network_info:
      File "/opt/stack/nova/nova/network/model.py", line 612, in __iter__
        return self._sync_wrapper(fn, *args, **kwargs)
      File "/opt/stack/nova/nova/network/model.py", line 603, in _sync_wrapper
        self.wait()
      File "/opt/stack/nova/nova/network/model.py", line 635, in wait
        self[:] = self._gt.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 181, in wait
        return self._exit_event.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
        result = hub.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
        return self.greenlet.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/greenthread.py", line 221, in main
        result = function(*args, **kwargs)
      File "/opt/stack/nova/nova/utils.py", line 654, in context_wrapper
        return func(*args, **kwargs)
      File "/opt/stack/nova/nova/compute/manager.py", line 1982, in _allocate_network_async
        raise e
      File "/opt/stack/nova/nova/compute/manager.py", line 1960, in _allocate_network_async
        nwinfo = self.network_api.allocate_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1229, in allocate_for_instance
        created_port_ids = self._update_ports_for_instance(
      File "/opt/stack/nova/nova/network/neutron.py", line 1365, in _update_ports_for_instance
        with excutils.save_and_reraise_exception():
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
        self.force_reraise()
      File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
        raise self.value
      File "/opt/stack/nova/nova/network/neutron.py", line 1340, in _update_ports_for_instance
        updated_port = self._update_port(
      File "/opt/stack/nova/nova/network/neutron.py", line 585, in _update_port
        _ensure_no_port_binding_failure(port)
      File "/opt/stack/nova/nova/network/neutron.py", line 294, in _ensure_no_port_binding_failure
        raise exception.PortBindingFailed(port_id=port['id'])
    nova.exception.PortBindingFailed: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information.
[ 601.895341] env[59490]: DEBUG nova.compute.utils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 601.895970] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Build of instance 84594817-90c9-4c87-b856-d0340b0d4972 was re-scheduled: Binding failed for port aefddd86-8f39-4948-b60a-006a3bad73a6, please check neutron logs for more information. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 601.896118] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 601.896358] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquiring lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 601.896502] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Acquired lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 601.896655] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 601.900719] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
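The "Failed to build and run instance" / "was re-scheduled" pairing shows the outcome of the error above: a port-binding failure aborts this host's attempt and hands the request back to the scheduler rather than erroring the instance out. A sketch of that decision (both callables are illustrative stand-ins, not Nova's real signatures):

class PortBindingFailed(Exception):
    """Stand-in for nova.exception.PortBindingFailed."""

def build_once(spawn, reschedule):
    """Try to build on this host; on a binding failure, ask the
    scheduler to try again elsewhere instead of failing hard."""
    try:
        spawn()
    except PortBindingFailed as exc:
        reschedule(reason=str(exc))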
[ 601.910899] env[59490]: DEBUG nova.network.neutron [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 601.923096] env[59490]: INFO nova.compute.manager [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] [instance: 760b4e7a-17ed-45c7-a7df-5698c9a358b6] Took 0.07 seconds to deallocate network for instance.
[ 601.966820] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 602.048078] env[59490]: INFO nova.scheduler.client.report [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Deleted allocations for instance 760b4e7a-17ed-45c7-a7df-5698c9a358b6
[ 602.075609] env[59490]: DEBUG oslo_concurrency.lockutils [None req-a66b45a6-cfb2-4f72-8be9-d8fd53e6b4d1 tempest-ServersWithSpecificFlavorTestJSON-111624353 tempest-ServersWithSpecificFlavorTestJSON-111624353-project-member] Lock "760b4e7a-17ed-45c7-a7df-5698c9a358b6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.893s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 602.411041] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 602.425790] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Releasing lock "refresh_cache-84594817-90c9-4c87-b856-d0340b0d4972" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 602.426140] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 602.426296] env[59490]: DEBUG nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 602.427148] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 602.546149] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 602.562853] env[59490]: DEBUG nova.network.neutron [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 602.579280] env[59490]: INFO nova.compute.manager [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] [instance: 84594817-90c9-4c87-b856-d0340b0d4972] Took 0.15 seconds to deallocate network for instance.
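"Deleted allocations for instance ..." corresponds to dropping the instance's resource allocations in Placement, which exposes this as DELETE /allocations/{consumer_uuid}. A sketch using the requests library (URL and token handling simplified; real deployments authenticate via Keystone, and the microversion pin here is just an example):

import requests  # assumption: python-requests is available

def delete_allocations(placement_url: str, consumer_uuid: str, token: str) -> None:
    """Remove every resource allocation held by the consumer
    (in Nova's case, the instance UUID)."""
    resp = requests.delete(
        f"{placement_url}/allocations/{consumer_uuid}",
        headers={"X-Auth-Token": token,
                 "OpenStack-API-Version": "placement 1.28"})
    resp.raise_for_status()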
[ 602.700180] env[59490]: INFO nova.scheduler.client.report [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Deleted allocations for instance 84594817-90c9-4c87-b856-d0340b0d4972
[ 602.739673] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1f49035a-f140-45b1-b3b8-68624e4cd918 tempest-ServersAdminNegativeTestJSON-420877850 tempest-ServersAdminNegativeTestJSON-420877850-project-member] Lock "84594817-90c9-4c87-b856-d0340b0d4972" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.742s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 619.349318] env[59490]: WARNING oslo_vmware.rw_handles [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 619.349318] env[59490]: ERROR oslo_vmware.rw_handles
    Traceback (most recent call last):
      File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
        self._conn.getresponse()
      File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
        response.begin()
      File "/usr/lib/python3.10/http/client.py", line 318, in begin
        version, status, reason = self._read_status()
      File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
        raise RemoteDisconnected("Remote end closed connection without"
    http.client.RemoteDisconnected: Remote end closed connection without response
[ 619.349931] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 619.351073] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 619.352287] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Copying Virtual Disk [datastore2] vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/44d0899c-af22-40f9-b422-f5128ac29a70/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 619.352652] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0626ab19-d380-4940-9b1d-ed49aa0d35ba {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.361412] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for the task: (returnval){
    value = "task-707366"
    _type = "Task"
} to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 619.372388] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Task: {'id': task-707366, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 619.875476] env[59490]: DEBUG oslo_vmware.exceptions [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 619.876055] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
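The rw_handles warning above is the tail end of a streamed image transfer: close() tries to read the server's final HTTP response over the raw connection, and vSphere may simply drop the socket. A stdlib-only sketch of tolerating that (mirrors the behaviour logged above, not oslo.vmware's actual code):

import http.client
import logging

LOG = logging.getLogger(__name__)

def drain_response(conn: http.client.HTTPConnection) -> None:
    """Read and discard the final response after a streamed transfer,
    tolerating servers that close without replying."""
    try:
        resp = conn.getresponse()
        resp.read()  # drain so the connection can be closed cleanly
    except http.client.RemoteDisconnected:
        LOG.warning("Remote end closed connection without response")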
[ 619.876873] env[59490]: ERROR nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
    Faults: ['InvalidArgument']
[ 619.876873] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771]
    Traceback (most recent call last):
      File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
        yield resources
      File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
        self.driver.spawn(context, instance, image_meta,
      File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
        self._vmops.spawn(context, instance, image_meta, injected_files,
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
        self._fetch_image_if_missing(context, vi)
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
        image_cache(vi, tmp_image_ds_loc)
      File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
        vm_util.copy_virtual_disk(
      File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
        session._wait_for_task(vmdk_copy_task)
      File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
        return self.wait_for_task(task_ref)
      File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
        return evt.wait()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
        result = hub.switch()
      File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
        return self.greenlet.switch()
      File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
        self.f(*self.args, **self.kw)
      File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
        raise exceptions.translate_fault(task_info.error)
    oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
    Faults: ['InvalidArgument']
[ 619.880745] env[59490]: INFO nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Terminating instance
[ 619.880745] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 619.881013] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 619.881013] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-53b37eac-626c-4cff-9293-995da1b0c30c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.885475] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 619.885475] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquired lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 619.885475] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 619.891788] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 619.891956] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 619.893303] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-06ef35cb-acd2-44f3-8838-091c5e62646e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 619.902175] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for the task: (returnval){
    value = "session[5210d47d-495b-8849-c195-d8b439f95142]5284ffe2-d145-3dbd-34e4-761c73831373"
    _type = "Task"
} to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 619.919833] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5284ffe2-d145-3dbd-34e4-761c73831373, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 619.976815] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 620.420650] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 620.420991] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating directory with path [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 620.421381] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fd6c96c5-6e6e-481f-9349-d3d5c3a6f6f1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 620.443338] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Created directory with path [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 620.444511] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Fetch image to [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 620.447832]
env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 620.447832] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d7d0f71d-fffc-4c6e-9ea1-e623752dfa36 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.456301] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6b6b410-240f-4fb3-b9e7-3e7ebbb4225c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.469694] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e0c626b-8c21-41fc-b020-85ac50dde9ad {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.509339] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b166a080-19d5-45d0-a5cd-2f886b64306a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.514666] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-61997f41-40d9-4c56-8c40-7af05e9fb935 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.543658] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 620.635062] env[59490]: DEBUG oslo_vmware.rw_handles [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 620.698588] env[59490]: DEBUG oslo_vmware.rw_handles [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Completed reading data from the image iterator. 
{{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 620.698738] env[59490]: DEBUG oslo_vmware.rw_handles [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 620.726271] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 620.736980] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Releasing lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 620.737413] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 620.737608] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 620.738710] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e42a9fa9-4c6c-40db-809e-97cbe7adba63 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.749782] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 620.750087] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8050b1b8-3ed0-4bef-abb5-c3d94e1f42ea {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.784807] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 620.784807] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] 
[instance: 1568985c-6898-4b06-817e-f0354a903771] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 620.785017] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Deleting the datastore file [datastore2] 1568985c-6898-4b06-817e-f0354a903771 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 620.785380] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7e63c4c1-f6cb-43dc-99f1-17dc225f54ca {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 620.792036] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for the task: (returnval){ [ 620.792036] env[59490]: value = "task-707368" [ 620.792036] env[59490]: _type = "Task" [ 620.792036] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 620.799993] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Task: {'id': task-707368, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 621.310818] env[59490]: DEBUG oslo_vmware.api [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Task: {'id': task-707368, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.035491} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 621.310818] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 621.310818] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 621.310818] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 621.310818] env[59490]: INFO nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Took 0.57 seconds to destroy the instance on the hypervisor. 
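The records above trace oslo.vmware's standard task loop: a *_Task method is invoked through the SOAP request handler, wait_for_task() parks the caller on an eventlet event (visible in the tracebacks as evt.wait() and hub.switch()), and a looping call drives _poll_task() until the task reaches a terminal state, translating error states into exceptions. A minimal sketch of that pattern against oslo.vmware's public session API; the vCenter address, credentials, and datastore path below are placeholders, not values from this log:

    # Sketch of the invoke-task / wait-for-task pattern seen in the records
    # above. Requires a reachable vCenter; endpoint and credentials are
    # placeholders.
    from oslo_vmware import api, exceptions, vim_util

    session = api.VMwareAPISession(
        'vc.example.test', 'svc-user', 'secret',
        api_retry_count=2,        # retries for transient API failures
        task_poll_interval=0.5)   # seconds between _poll_task iterations

    # Look up a Datacenter ref; get_objects drives the PropertyCollector,
    # which is what the RetrievePropertiesEx records here correspond to.
    # (A production caller would continue/cancel the retrieval via vim_util.)
    retrieve_result = session.invoke_api(vim_util, 'get_objects',
                                         session.vim, 'Datacenter', 1)
    dc_ref = retrieve_result.objects[0].obj

    # Issue a datastore file deletion, mirroring the
    # FileManager.DeleteDatastoreFile_Task invocation logged above.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                              file_manager,
                              name='[datastore2] vmware_temp/placeholder',
                              datacenter=dc_ref)

    try:
        # Blocks while a looping call polls the task; an error state raises
        # the result of exceptions.translate_fault(task_info.error).
        session.wait_for_task(task)
    except exceptions.VMwareDriverException as exc:
        # Unmatched faults surface as the generic VimFaultException, the
        # same type as the Faults: ['InvalidArgument'] failures in this log.
        print('task failed: %s (faults=%s)'
              % (exc, getattr(exc, 'fault_list', None)))

The "progress is 0%" and "completed successfully" records above are _poll_task's periodic reads of the task's info property, logged once per task_poll_interval.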
[ 621.311072] env[59490]: DEBUG oslo.service.loopingcall [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 621.311072] env[59490]: DEBUG nova.compute.manager [-] [instance: 1568985c-6898-4b06-817e-f0354a903771] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 621.315402] env[59490]: DEBUG nova.compute.claims [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 621.315480] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 621.315948] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 621.445934] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92927e04-dcf2-4ada-baec-c8bb327c702b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.454723] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ce0ca0f-2cea-41a0-90b0-69c34e2b1326 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.497269] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-365c13f7-6a8d-4927-ac5c-7582bf8120da {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.505323] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1efb5eb-66f2-44b8-a141-74feea452141 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 621.519408] env[59490]: DEBUG nova.compute.provider_tree [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 621.532020] env[59490]: DEBUG nova.scheduler.client.report [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Inventory has not changed for provider 
715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 621.550243] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.234s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 621.552111] env[59490]: ERROR nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 621.552111] env[59490]: Faults: ['InvalidArgument'] [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] Traceback (most recent call last): [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] self.driver.spawn(context, instance, image_meta, [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] self._vmops.spawn(context, instance, image_meta, injected_files, [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] self._fetch_image_if_missing(context, vi) [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] image_cache(vi, tmp_image_ds_loc) [ 621.552111] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] vm_util.copy_virtual_disk( [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] session._wait_for_task(vmdk_copy_task) [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 
1568985c-6898-4b06-817e-f0354a903771] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] return self.wait_for_task(task_ref) [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] return evt.wait() [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] result = hub.switch() [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] return self.greenlet.switch() [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 621.552656] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] self.f(*self.args, **self.kw) [ 621.553068] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 621.553068] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] raise exceptions.translate_fault(task_info.error) [ 621.553068] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 621.553068] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] Faults: ['InvalidArgument'] [ 621.553068] env[59490]: ERROR nova.compute.manager [instance: 1568985c-6898-4b06-817e-f0354a903771] [ 621.553068] env[59490]: DEBUG nova.compute.utils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 621.555419] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Build of instance 1568985c-6898-4b06-817e-f0354a903771 was re-scheduled: A specified parameter was not correct: fileType [ 621.555419] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 621.555965] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} 
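The lock bookkeeping that threads through these records ('Acquiring lock ... by ...', 'acquired ... :: waited 0.000s', '"released" ... :: held 0.234s') is emitted by oslo.concurrency's synchronized decorator, which wraps the call and logs the wait time before entry and the hold time after exit. A minimal sketch, assuming only oslo.concurrency is installed; the lock name mirrors the log, and the function body is a stand-in:

    # Sketch of the lockutils.synchronized pattern behind the waited/held
    # lock records in this log.
    import logging
    import time

    from oslo_concurrency import lockutils

    # lockutils emits the acquire/release messages at DEBUG level.
    logging.basicConfig(level=logging.DEBUG)

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim():
        # Critical section: everything here runs while holding the
        # in-process 'compute_resources' lock.
        time.sleep(0.05)

    abort_instance_claim()

Run as-is, this produces the same three-message pattern, with the decorated function's name reported in place of ResourceTracker.abort_instance_claim.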
[ 621.556351] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquiring lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 621.556911] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Acquired lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 621.557183] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 621.665178] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 622.459979] env[59490]: DEBUG nova.network.neutron [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 622.474919] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Releasing lock "refresh_cache-1568985c-6898-4b06-817e-f0354a903771" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 622.475218] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 622.475329] env[59490]: DEBUG nova.compute.manager [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] [instance: 1568985c-6898-4b06-817e-f0354a903771] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 622.585870] env[59490]: INFO nova.scheduler.client.report [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Deleted allocations for instance 1568985c-6898-4b06-817e-f0354a903771 [ 622.609183] env[59490]: DEBUG oslo_concurrency.lockutils [None req-2fbb5131-7634-450a-bcea-0c2c1481b919 tempest-ServerDiagnosticsV248Test-897504850 tempest-ServerDiagnosticsV248Test-897504850-project-member] Lock "1568985c-6898-4b06-817e-f0354a903771" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 53.223s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 622.609730] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "1568985c-6898-4b06-817e-f0354a903771" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 41.993s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 622.609730] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 1568985c-6898-4b06-817e-f0354a903771] During sync_power_state the instance has a pending task (spawning). Skip. [ 622.609730] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "1568985c-6898-4b06-817e-f0354a903771" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 641.398865] env[59490]: WARNING oslo_vmware.rw_handles [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 641.398865] env[59490]: ERROR oslo_vmware.rw_handles [ 641.398865] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 
31c074c3-93cf-4f48-b003-253fc5405e35] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore1 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 641.399706] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 641.399706] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Copying Virtual Disk [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore1] vmware_temp/7720cba0-f437-4999-9621-0c7557108b74/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 641.400918] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b22d0a44-fac5-4571-a470-b69e73fe7f82 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 641.409661] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for the task: (returnval){ [ 641.409661] env[59490]: value = "task-707369" [ 641.409661] env[59490]: _type = "Task" [ 641.409661] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 641.419557] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Task: {'id': task-707369, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 641.926046] env[59490]: DEBUG oslo_vmware.exceptions [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Fault InvalidArgument not matched. 
{{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 641.926388] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Releasing lock "[datastore1] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 641.926977] env[59490]: ERROR nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 641.926977] env[59490]: Faults: ['InvalidArgument'] [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Traceback (most recent call last): [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] yield resources [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self.driver.spawn(context, instance, image_meta, [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self._fetch_image_if_missing(context, vi) [ 641.926977] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] image_cache(vi, tmp_image_ds_loc) [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] vm_util.copy_virtual_disk( [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] session._wait_for_task(vmdk_copy_task) [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return self.wait_for_task(task_ref) [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return evt.wait() [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] result = hub.switch() [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 641.927467] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return self.greenlet.switch() [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self.f(*self.args, **self.kw) [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] raise exceptions.translate_fault(task_info.error) [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Faults: ['InvalidArgument'] [ 641.927873] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] [ 641.927873] env[59490]: INFO nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Terminating instance [ 641.933528] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 641.933646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquired lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 641.933804] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 
642.044945] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 642.450258] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 642.462730] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Releasing lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 642.463230] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 642.464503] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 642.464621] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f6e71c4-ed1d-43c5-ab5e-5c1708eaca54 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.474010] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 642.476612] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d52992c3-f751-4c3e-816f-d15a38f62d6b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.511511] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 642.511511] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Deleting contents of the VM from datastore datastore1 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 642.511511] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None 
req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Deleting the datastore file [datastore1] 31c074c3-93cf-4f48-b003-253fc5405e35 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 642.511511] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4912a99a-80a0-4473-a102-4711b94934c9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 642.521629] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for the task: (returnval){ [ 642.521629] env[59490]: value = "task-707371" [ 642.521629] env[59490]: _type = "Task" [ 642.521629] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 642.534269] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Task: {'id': task-707371, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 643.035722] env[59490]: DEBUG oslo_vmware.api [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Task: {'id': task-707371, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.040838} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 643.035969] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 643.036163] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Deleted contents of the VM from datastore datastore1 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 643.036346] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 643.036515] env[59490]: INFO nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Took 0.57 seconds to destroy the instance on the hypervisor. [ 643.036738] env[59490]: DEBUG oslo.service.loopingcall [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 643.036919] env[59490]: DEBUG nova.compute.manager [-] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 643.039248] env[59490]: DEBUG nova.compute.claims [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 643.039400] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 643.039598] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 643.133745] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-835e472b-78dc-4405-8d5e-9d5ebe9904a8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.142485] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bc90af5-95e4-448b-9273-06911d5bd741 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.196749] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd248ac8-eb58-444e-993f-cee33bb62ee8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.196749] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6ff4796-96a2-4b0f-b34d-c24a29ba6319 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 643.196749] env[59490]: DEBUG nova.compute.provider_tree [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 643.206798] env[59490]: DEBUG nova.scheduler.client.report [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 643.226034] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.186s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 643.226586] env[59490]: ERROR nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 643.226586] env[59490]: Faults: ['InvalidArgument'] [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Traceback (most recent call last): [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self.driver.spawn(context, instance, image_meta, [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self._vmops.spawn(context, instance, image_meta, injected_files, [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self._fetch_image_if_missing(context, vi) [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] image_cache(vi, tmp_image_ds_loc) [ 643.226586] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] vm_util.copy_virtual_disk( [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] session._wait_for_task(vmdk_copy_task) [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return self.wait_for_task(task_ref) [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return evt.wait() [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] result = hub.switch() [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] return self.greenlet.switch() [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 643.226914] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] self.f(*self.args, **self.kw) [ 643.227243] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 643.227243] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] raise exceptions.translate_fault(task_info.error) [ 643.227243] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 643.227243] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Faults: ['InvalidArgument'] [ 643.227243] env[59490]: ERROR nova.compute.manager [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] [ 643.227417] env[59490]: DEBUG nova.compute.utils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 643.229268] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Build of instance 31c074c3-93cf-4f48-b003-253fc5405e35 was re-scheduled: A specified parameter was not correct: fileType [ 643.229268] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 643.229583] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 643.229812] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquiring lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 643.229953] env[59490]: 
DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Acquired lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 643.230128] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 643.351222] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 643.638543] env[59490]: DEBUG nova.network.neutron [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 643.647437] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Releasing lock "refresh_cache-31c074c3-93cf-4f48-b003-253fc5405e35" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 643.648592] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 643.648973] env[59490]: DEBUG nova.compute.manager [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] [instance: 31c074c3-93cf-4f48-b003-253fc5405e35] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 643.748204] env[59490]: INFO nova.scheduler.client.report [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Deleted allocations for instance 31c074c3-93cf-4f48-b003-253fc5405e35 [ 643.787559] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fd688eae-b5d8-4d0d-aab5-9c1c8cfd76cd tempest-ServersAaction247Test-277042623 tempest-ServersAaction247Test-277042623-project-member] Lock "31c074c3-93cf-4f48-b003-253fc5405e35" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 51.152s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 645.742746] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 645.766212] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 645.766212] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 645.766212] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 645.766212] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 646.384259] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 646.384479] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 646.384628] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 646.403467] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.403709] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.403992] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 646.403992] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 646.405114] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01c40a6a-fbc6-4356-94ff-9b9504ef3ed3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.415463] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72878484-3024-4fe4-82ba-0f7a9eb9486f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.431993] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34cd3a02-789b-4e56-b11d-ab203a92f383 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.439496] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a3bee94-5d3e-495c-a11c-cb8151e3c460 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.473223] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181634MB free_disk=80GB 
free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 646.473412] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 646.473971] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 646.552195] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e9f81c59-44ea-4276-a310-7581e3a7abb1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 646.552387] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 646.552522] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 646.584091] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dabaa64b-11ff-4ef7-a03b-4afdd473d301 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.596154] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72f5bf45-2b8d-45a0-8a49-2f449a91a2a8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.629922] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b89fa82-b8da-4fe5-8c31-91271ca61695 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.641025] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-558ae88a-59db-4f34-b6eb-62c8c11e74ce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 646.661737] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 646.671074] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': 
{'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 646.687255] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 646.687450] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.214s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 647.683126] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.683126] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 647.683126] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 647.683126] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 647.698399] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 647.698561] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 647.699021] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 655.598467] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.598840] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.616854] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 655.688327] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 655.688604] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 655.690544] env[59490]: INFO nova.compute.claims [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 655.862628] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b28ea843-98d3-4fc2-bdbf-af6e1fae8c6f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.873987] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fdfd633e-2eef-4594-83a9-7f9dfecdabb8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.913405] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-7c3d058c-edf6-426f-95de-faf8a59a7a87 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.920941] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4c4c0f72-a8d5-42b4-b9e0-fec39ee9e3a8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 655.933912] env[59490]: DEBUG nova.compute.provider_tree [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 655.944043] env[59490]: DEBUG nova.scheduler.client.report [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 655.961377] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.272s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 655.961491] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 656.000962] env[59490]: DEBUG nova.compute.utils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 656.001936] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Allocating IP information in the background. 
{{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 656.002129] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 656.017319] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 656.102021] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 656.125577] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 656.127111] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 656.127111] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 656.127111] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 656.127111] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:392}} [ 656.127111] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 656.127425] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 656.127907] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 656.128395] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 656.128713] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 656.131019] env[59490]: DEBUG nova.virt.hardware [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 656.131019] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-729c41eb-6cb7-4762-92f1-1f7e47a10604 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.142188] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb1bd91d-98ce-404b-a5f0-82e1c1014721 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 656.581948] env[59490]: DEBUG nova.policy [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '698f4bafd7d44a8ab78e7c743666a58f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '88b5a27edd5f4d9796aa911867721f9d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 657.461741] 
env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "08923fae-e356-444d-b221-b40576b54af9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.464799] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "08923fae-e356-444d-b221-b40576b54af9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.480993] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 657.550354] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 657.550641] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 657.552445] env[59490]: INFO nova.compute.claims [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 657.672685] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec432d2f-7308-4416-ab79-e1cfc85a15d4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.681116] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29bf674b-9898-4c8f-98e5-0757875fe86d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.719533] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a5701fa-7c67-496a-babe-23e17107b13f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.729017] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f7389a4-360b-4618-b0e0-b9d26c76a846 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.744960] env[59490]: DEBUG nova.compute.provider_tree [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 
tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 657.759341] env[59490]: DEBUG nova.scheduler.client.report [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 657.778565] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.228s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 657.779032] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 657.821574] env[59490]: DEBUG nova.compute.utils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 657.825822] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 657.825822] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 657.834231] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 657.926881] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 657.950463] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 657.950747] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 657.950912] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 657.951109] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 657.951253] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 657.951397] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 657.951596] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 657.951746] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 657.951907] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 
tempest-ServersTestJSON-1646516409-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 657.952075] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 657.952243] env[59490]: DEBUG nova.virt.hardware [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 657.953175] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83b95e70-d895-428a-8e21-b6e131b19611 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 657.962016] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a438082f-3c08-4b96-a8f5-aa844bfbbd96 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 658.198122] env[59490]: DEBUG nova.policy [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '59edea3a90eb45c28fbb5ceb426d0629', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '312c91f87af54c4abacd034186d368d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 660.180236] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Successfully created port: 80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 661.891471] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Successfully created port: 1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 664.946326] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "398edc73-9487-4365-9e55-6eaa1f530f64" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 664.946601] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock 
"398edc73-9487-4365-9e55-6eaa1f530f64" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 664.958237] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 665.021566] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 665.021566] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 665.023357] env[59490]: INFO nova.compute.claims [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 665.163257] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30299b77-d28c-4e60-8b66-9aa671af5e43 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.176094] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95a83267-2068-4694-aa44-285c4fe63620 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.217337] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0accf679-ab72-45a1-90d4-635dbccc9850 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.226636] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-224f2bc4-3243-490b-a1b6-1b4826a0fd8f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.245518] env[59490]: DEBUG nova.compute.provider_tree [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 665.261712] env[59490]: DEBUG nova.scheduler.client.report [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 665.286533] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.266s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 665.287478] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 665.325963] env[59490]: DEBUG nova.compute.utils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 665.327688] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 665.328334] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 665.340587] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 665.438469] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 665.465525] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 665.465758] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 665.465906] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 665.466093] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 665.466234] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 665.466401] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 665.466609] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 665.466762] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 665.466918] env[59490]: DEBUG nova.virt.hardware [None 
req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 665.467095] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 665.467263] env[59490]: DEBUG nova.virt.hardware [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 665.468152] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d71c656-7844-436d-8c8d-cb1a968c107e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.477955] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3db98260-567c-4f4b-bf60-a282712ccd0b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 665.937089] env[59490]: DEBUG nova.policy [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93256992d7a84e72882b4c132c337393', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2133066748948909baea488349a4b78', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 666.762252] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Successfully updated port: 80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 666.778394] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 666.778552] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquired lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 666.778704] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 
tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 667.243434] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 667.819414] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "71698ce4-94a0-442c-8081-374616ce2ac4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.819695] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "71698ce4-94a0-442c-8081-374616ce2ac4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.830011] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 667.883445] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 667.883680] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 667.885697] env[59490]: INFO nova.compute.claims [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 667.959868] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Successfully updated port: 1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 667.976041] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 667.976801] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 667.976801] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 668.043705] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fabdfba-eda8-4b35-8eaa-ec068366ffd0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.052511] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dc772ee-cd69-48e8-b31f-dfe8d6f9d47a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.088833] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbd1b0c9-4e1f-4b4e-bc32-245861f617ae {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.099630] env[59490]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8143f903-eec9-4778-9893-66c1b17d972b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.114790] env[59490]: DEBUG nova.compute.provider_tree [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 668.128924] env[59490]: DEBUG nova.scheduler.client.report [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 668.149321] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 668.149818] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 668.191831] env[59490]: DEBUG nova.compute.utils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 668.192337] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 668.192552] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 668.205032] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Start building block device mappings for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 668.305680] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 668.330921] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 668.330921] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 668.330921] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 668.331129] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 668.331234] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 668.331374] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 668.331579] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 
668.331731] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 668.331889] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 668.335031] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 668.335031] env[59490]: DEBUG nova.virt.hardware [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 668.335031] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4ec6189-5c34-49f5-81a5-16111a3ea8a8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.345017] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7014671c-00ce-4c2e-8964-5e042a608401 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 668.439994] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Successfully created port: 1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 668.449611] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 668.719103] env[59490]: DEBUG nova.policy [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '34159240b1e1494381da9c477d79a652', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9880d419bb3d456c976df0d229b4323f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 669.086605] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Updating instance_info_cache with network_info: [{"id": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "address": "fa:16:3e:a7:2c:75", "network": {"id": "5e4e1842-a942-4bb5-b90a-0ba9c7439795", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2115201309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88b5a27edd5f4d9796aa911867721f9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80c207c0-df", "ovs_interfaceid": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 669.105200] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Releasing lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 669.105514] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Instance network_info: |[{"id": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "address": "fa:16:3e:a7:2c:75", "network": {"id": "5e4e1842-a942-4bb5-b90a-0ba9c7439795", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2115201309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88b5a27edd5f4d9796aa911867721f9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80c207c0-df", "ovs_interfaceid": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 669.105964] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:2c:75', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f85835c8-5d0c-4b2f-97c4-6c4006580f79', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '80c207c0-df3b-451a-924a-9a4d0cffc30b', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 669.120525] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Creating folder: Project (88b5a27edd5f4d9796aa911867721f9d). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.121105] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9399b8bf-ea89-433f-9489-79bdf0890fe2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.133549] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Created folder: Project (88b5a27edd5f4d9796aa911867721f9d) in parent group-v168905. [ 669.133666] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Creating folder: Instances. Parent ref: group-v168919. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 669.133887] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-53d88ac1-31d5-4d14-be30-fac44e3479de {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.146388] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Created folder: Instances in parent group-v168919. 
[ 669.146649] env[59490]: DEBUG oslo.service.loopingcall [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 669.146833] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 669.147039] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-2545eace-3b4d-4652-8dee-9a2a7c7b3b85 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.167545] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 669.167545] env[59490]: value = "task-707374" [ 669.167545] env[59490]: _type = "Task" [ 669.167545] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 669.175731] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707374, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 669.678653] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707374, 'name': CreateVM_Task, 'duration_secs': 0.303318} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 669.678841] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 669.712306] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 669.712480] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 669.712919] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 669.713075] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d279eff3-33f4-4e73-a37a-2ff7e81a54ea {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 669.723663] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Waiting for the task: 
(returnval){ [ 669.723663] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5222b5f9-7728-08de-efb8-3fcdf66b9651" [ 669.723663] env[59490]: _type = "Task" [ 669.723663] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 669.730486] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5222b5f9-7728-08de-efb8-3fcdf66b9651, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 669.867410] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "0ec55812-86b7-44ef-822a-88a2ff1816c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.867410] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "0ec55812-86b7-44ef-822a-88a2ff1816c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.891897] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 669.968574] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 669.968940] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 669.970636] env[59490]: INFO nova.compute.claims [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 670.159523] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96745ba3-6dd2-433f-96b1-31c5f77de72e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.170691] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-669045dd-4501-452b-a2d4-9a4ac3ebe91d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.203617] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Updating instance_info_cache with network_info: [{"id": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "address": "fa:16:3e:51:6d:cd", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2f9217-43", "ovs_interfaceid": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 670.205157] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0afc07c2-2d81-4c9f-93c6-f89814ccb552 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.213712] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a8bf8a-b95a-4197-b223-abecfcc55888 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.222199] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 670.222498] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Instance network_info: |[{"id": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "address": "fa:16:3e:51:6d:cd", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2f9217-43", "ovs_interfaceid": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 670.226242] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:51:6d:cd', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3c405e9f-a6c8-4308-acac-071654efe18e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1d2f9217-4341-4b1c-a20f-7eca46015c75', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 670.234638] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating folder: Project (312c91f87af54c4abacd034186d368d4). Parent ref: group-v168905. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 670.243925] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3a93f29d-4941-49ad-8d66-3a09b4539465 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.246388] env[59490]: DEBUG nova.compute.provider_tree [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 670.253928] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 670.254201] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 670.254882] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 670.258435] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created folder: Project (312c91f87af54c4abacd034186d368d4) in parent group-v168905. [ 670.258609] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating folder: Instances. Parent ref: group-v168922. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 670.259340] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-464b4890-7d31-4908-99a7-73affa6e2d9b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.261997] env[59490]: DEBUG nova.scheduler.client.report [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 670.275454] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created folder: Instances in parent group-v168922. [ 670.275454] env[59490]: DEBUG oslo.service.loopingcall [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 670.275454] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08923fae-e356-444d-b221-b40576b54af9] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 670.275454] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e65040d9-78be-4cbf-8c8c-5e16abec30d7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.291268] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.322s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 670.291748] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 670.299340] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 670.299340] env[59490]: value = "task-707377" [ 670.299340] env[59490]: _type = "Task" [ 670.299340] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 670.307968] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707377, 'name': CreateVM_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 670.327752] env[59490]: DEBUG nova.compute.utils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 670.329062] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 670.330023] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 670.339431] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 670.416211] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 670.442530] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 670.442771] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 670.442920] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 670.443107] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 670.443249] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 670.443389] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 670.443597] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 670.444256] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 670.444706] env[59490]: DEBUG nova.virt.hardware [None
req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 670.444706] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 670.444851] env[59490]: DEBUG nova.virt.hardware [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 670.445695] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-324ac65f-f8d6-4597-8919-977a77f26c1b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.455717] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-921cc210-69de-4de1-95de-c572742a1e18 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.624594] env[59490]: WARNING oslo_vmware.rw_handles [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 670.624594] env[59490]: ERROR oslo_vmware.rw_handles [ 670.624594] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 670.625987] env[59490]: DEBUG 
nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 670.626545] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Copying Virtual Disk [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/0749073a-45bc-46bf-8982-4fc1c428e8da/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 670.626773] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-aea358bf-3cea-488e-8948-5ceb92328c3b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.636875] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for the task: (returnval){ [ 670.636875] env[59490]: value = "task-707378" [ 670.636875] env[59490]: _type = "Task" [ 670.636875] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 670.647204] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Task: {'id': task-707378, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 670.811420] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707377, 'name': CreateVM_Task, 'duration_secs': 0.373843} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 670.811420] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 08923fae-e356-444d-b221-b40576b54af9] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 670.811610] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 670.811895] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 670.812230] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 670.815657] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-187d3f0a-1112-4380-a31f-18f532e37304 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 670.818227] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){ [ 670.818227] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52ff6415-283a-5253-989e-aca825d74df4" [ 670.818227] env[59490]: _type = "Task" [ 670.818227] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 670.827574] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52ff6415-283a-5253-989e-aca825d74df4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 671.035337] env[59490]: DEBUG nova.policy [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a42fbbb31ae444fd8eabf8c03e382984', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e6c6630fd91742da8a42a3c0ae5a5e04', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 671.148428] env[59490]: DEBUG oslo_vmware.exceptions [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 671.148428] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 671.148811] env[59490]: ERROR nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 671.148811] env[59490]: Faults: ['InvalidArgument'] [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Traceback (most recent call last): [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] yield resources [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self.driver.spawn(context, instance, image_meta, [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self._fetch_image_if_missing(context, vi) [ 671.148811] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] image_cache(vi, tmp_image_ds_loc) [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] vm_util.copy_virtual_disk( [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] session._wait_for_task(vmdk_copy_task) [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return self.wait_for_task(task_ref) [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return evt.wait() [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] result = hub.switch() [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 671.149356] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return self.greenlet.switch() [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self.f(*self.args, **self.kw) [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] raise exceptions.translate_fault(task_info.error) [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Faults: ['InvalidArgument'] [ 671.150015] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] [ 671.150015] env[59490]: INFO nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Terminating instance [ 671.150918] env[59490]: DEBUG 
oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 671.150918] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 671.151222] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0238f44b-b753-4cc2-8070-db25be930c48 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.153302] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 671.153451] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquired lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 671.154991] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 671.172649] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 671.172649] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 671.176370] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-30bc2b93-1ed5-41d7-b756-4f0e934d0799 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.187295] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Waiting for the task: (returnval){ [ 671.187295] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52ff02a3-8935-30ee-3337-2affd6c6ba68" [ 671.187295] env[59490]: _type = "Task" [ 671.187295] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 671.196323] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52ff02a3-8935-30ee-3337-2affd6c6ba68, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 671.302263] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 671.332154] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 671.332410] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 671.332638] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 671.696961] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 671.696961] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Creating directory with path [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 671.697189] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96ae6b57-b0c3-452e-8074-4a247b770392 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.721470] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Created directory with path [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 
671.721736] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Fetch image to [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 671.721858] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 671.722672] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05e3ebb0-175f-4478-9a8b-28fc443cf2ab {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.733831] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69efd222-f8c4-497a-a873-d3656b457785 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.747349] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e83ce12-4e4e-4e11-92da-47ec500c093b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.785645] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-871aecec-349b-4523-8cdf-9bd26143bb24 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.793076] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-70eff99e-89e2-466c-8eac-5db7fef46994 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 671.816079] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 671.872008] env[59490]: DEBUG oslo_vmware.rw_handles [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 671.938982] env[59490]: DEBUG oslo_vmware.rw_handles [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 671.939191] env[59490]: DEBUG oslo_vmware.rw_handles [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 672.105054] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 672.116180] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Releasing lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 672.116889] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 672.116955] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 672.119116] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06beb37e-a14c-4696-a7a4-751d8817e67b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.127595] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 672.127883] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a006fef4-9c55-4e22-b02f-de8e207ccd95 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.152508] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 672.154096] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 672.154096] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Deleting the datastore file [datastore2] e9f81c59-44ea-4276-a310-7581e3a7abb1 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 672.154096] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-324d6692-37ca-4bc1-bab7-b7711e762151 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.165034] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for the task: (returnval){ [ 672.165034] env[59490]: value = "task-707380" [ 672.165034] env[59490]: _type = "Task" [ 672.165034] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 672.170986] env[59490]: DEBUG nova.compute.manager [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Received event network-vif-plugged-80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 672.175287] env[59490]: DEBUG oslo_concurrency.lockutils [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] Acquiring lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.175287] env[59490]: DEBUG oslo_concurrency.lockutils [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] Lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.002s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.175287] env[59490]: DEBUG oslo_concurrency.lockutils [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] Lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.175287] env[59490]: DEBUG nova.compute.manager [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] No waiting events found dispatching network-vif-plugged-80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 672.175999] env[59490]: WARNING nova.compute.manager [req-7b0f4b73-c8c3-4d70-9cff-692f0e7bdc6a req-6bf602ef-3173-4ba4-8e6b-7338099d7711 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Received unexpected event network-vif-plugged-80c207c0-df3b-451a-924a-9a4d0cffc30b for instance with vm_state building and task_state spawning. [ 672.184916] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Task: {'id': task-707380, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 672.333839] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Successfully updated port: 1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 672.344782] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 672.344929] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 672.345107] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 672.461681] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 672.471379] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Successfully created port: 012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 672.679255] env[59490]: DEBUG oslo_vmware.api [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Task: {'id': task-707380, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.037451} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 672.679608] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 672.679808] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 672.680345] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 672.680345] env[59490]: INFO nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Took 0.56 seconds to destroy the instance on the hypervisor. [ 672.680667] env[59490]: DEBUG oslo.service.loopingcall [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 672.680667] env[59490]: DEBUG nova.compute.manager [-] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Skipping network deallocation for instance since networking was not requested. 
{{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 672.685632] env[59490]: DEBUG nova.compute.claims [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 672.685807] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 672.686030] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 672.839416] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-704fecee-ed53-4441-ac88-ea3166316c8a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.849566] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc0f8818-ed2c-4fc7-90c4-3614468d540b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.882549] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e24e30e3-6dc8-4259-b85d-05da0f8c9036 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.892919] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b443ad24-4b3a-4688-b057-c71b0773afb3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 672.906907] env[59490]: DEBUG nova.compute.provider_tree [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 672.916389] env[59490]: DEBUG nova.scheduler.client.report [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 672.932363] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 
tempest-ServersAdmin275Test-735062589-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.246s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 672.932801] env[59490]: ERROR nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 672.932801] env[59490]: Faults: ['InvalidArgument'] [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Traceback (most recent call last): [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self.driver.spawn(context, instance, image_meta, [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self._vmops.spawn(context, instance, image_meta, injected_files, [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self._fetch_image_if_missing(context, vi) [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] image_cache(vi, tmp_image_ds_loc) [ 672.932801] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] vm_util.copy_virtual_disk( [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] session._wait_for_task(vmdk_copy_task) [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return self.wait_for_task(task_ref) [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return evt.wait() [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File 
"/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] result = hub.switch() [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] return self.greenlet.switch() [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 672.933202] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] self.f(*self.args, **self.kw) [ 672.933570] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 672.933570] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] raise exceptions.translate_fault(task_info.error) [ 672.933570] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 672.933570] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Faults: ['InvalidArgument'] [ 672.933570] env[59490]: ERROR nova.compute.manager [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] [ 672.933570] env[59490]: DEBUG nova.compute.utils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 672.935498] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Build of instance e9f81c59-44ea-4276-a310-7581e3a7abb1 was re-scheduled: A specified parameter was not correct: fileType [ 672.935498] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 672.935868] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 672.936109] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquiring lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 672.936292] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Acquired lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 672.936488] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 673.079711] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 673.260464] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.260794] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.269854] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 673.327208] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 673.327553] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.329678] env[59490]: INFO nova.compute.claims [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 673.513927] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Updating instance_info_cache with network_info: [{"id": "1e015731-afb6-494c-a1c9-cada19c93973", "address": "fa:16:3e:15:80:2f", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1e015731-af", "ovs_interfaceid": "1e015731-afb6-494c-a1c9-cada19c93973", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.519036] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18f1cf33-c4d4-40b5-86ff-b134e39f0e4b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.525278] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-529ef626-cc39-4a7a-b186-f733cf4f229b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.563380] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a166cf1-4df0-4dac-806e-550c92e446c2 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.566659] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 673.567009] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Instance network_info: |[{"id": "1e015731-afb6-494c-a1c9-cada19c93973", "address": "fa:16:3e:15:80:2f", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1e015731-af", "ovs_interfaceid": "1e015731-afb6-494c-a1c9-cada19c93973", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 673.567683] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:15:80:2f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3ac3fd84-c373-49f5-82dc-784a6cdb686d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1e015731-afb6-494c-a1c9-cada19c93973', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 673.576046] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating folder: Project (c2133066748948909baea488349a4b78). Parent ref: group-v168905. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 673.577367] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8c5d5a6a-abca-47fa-a41d-ae3413b343aa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.582751] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7665607b-a7e9-4656-aa64-edfe47f79fce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.596474] env[59490]: DEBUG nova.compute.provider_tree [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 673.599119] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created folder: Project (c2133066748948909baea488349a4b78) in parent group-v168905. [ 673.601267] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating folder: Instances. Parent ref: group-v168925. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 673.601685] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e96a1823-c439-4b21-923e-a76653b11143 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.608349] env[59490]: DEBUG nova.scheduler.client.report [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 673.614084] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created folder: Instances in parent group-v168925. [ 673.614084] env[59490]: DEBUG oslo.service.loopingcall [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 673.614084] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 673.614251] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c71ba951-4cf3-4cd9-96d9-08c96ff4ad90 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.631023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.631535] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 673.640626] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 673.640626] env[59490]: value = "task-707383" [ 673.640626] env[59490]: _type = "Task" [ 673.640626] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 673.651611] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707383, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 673.676027] env[59490]: DEBUG nova.compute.utils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 673.684762] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 673.684762] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 673.693634] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Start building block device mappings for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 673.718721] env[59490]: DEBUG nova.network.neutron [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 673.742262] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Releasing lock "refresh_cache-e9f81c59-44ea-4276-a310-7581e3a7abb1" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 673.742262] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 673.742262] env[59490]: DEBUG nova.compute.manager [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 673.807829] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 673.842854] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 673.843357] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 673.844185] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 673.844185] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 673.844185] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 673.844185] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 673.844891] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 673.844891] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 673.845020] 
env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 673.845421] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 673.845421] env[59490]: DEBUG nova.virt.hardware [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 673.846284] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0883b62e-328c-4c6b-bc27-6c4a1d0dc425 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.861828] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3248965-bd22-4636-800d-39e8e71ca17c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 673.887891] env[59490]: INFO nova.scheduler.client.report [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Deleted allocations for instance e9f81c59-44ea-4276-a310-7581e3a7abb1 [ 673.916744] env[59490]: DEBUG oslo_concurrency.lockutils [None req-0e9be15a-09c2-448d-bf0e-b78337a3023b tempest-ServersAdmin275Test-735062589 tempest-ServersAdmin275Test-735062589-project-member] Lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 103.180s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 673.917215] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 93.300s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 673.917215] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e9f81c59-44ea-4276-a310-7581e3a7abb1] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 673.917341] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "e9f81c59-44ea-4276-a310-7581e3a7abb1" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 674.136108] env[59490]: DEBUG nova.policy [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a432a4ea8e3443e3a14281c3a3ffab77', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'df2c0efc6a764e93809ce650abd4a2ee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 674.154556] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707383, 'name': CreateVM_Task, 'duration_secs': 0.309892} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 674.154556] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 674.154860] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 674.155250] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 674.156211] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 674.156211] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-71a633a5-6ffc-4a46-be5a-ea63da64cfc4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 674.161906] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 674.161906] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52359a9d-f59f-804d-e016-76f99d75cde4" [ 674.161906] env[59490]: _type = "Task" [ 674.161906] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 674.170987] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52359a9d-f59f-804d-e016-76f99d75cde4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 674.523581] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Successfully created port: ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 674.674183] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 674.674433] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 674.674647] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 675.136131] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.136412] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.146572] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 675.197077] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 675.197326] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 675.198822] env[59490]: INFO nova.compute.claims [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 675.393390] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d6fbc3b-0696-49ef-9cda-3cab9b902383 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.405097] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6397f346-bb42-4939-b163-47b3ffef6ec3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.441292] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f67bebc-96b0-49ef-b075-35251056a973 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.449418] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da88cceb-0d29-49f7-96b7-533c63d93956 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.464885] env[59490]: DEBUG nova.compute.provider_tree [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 675.475366] env[59490]: DEBUG nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 675.494556] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 
tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.297s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 675.496344] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 675.531974] env[59490]: DEBUG nova.compute.utils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 675.533703] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Not allocating networking since 'none' was specified. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}} [ 675.542951] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 675.619532] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 675.649615] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 675.649811] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 675.650112] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 675.650509] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 675.651667] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 675.651667] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 675.651667] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 675.651872] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 675.652034] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866
tempest-ServerShowV247Test-818522866-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 675.652229] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 675.652432] env[59490]: DEBUG nova.virt.hardware [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 675.653417] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c97b7bb3-d141-434d-91fd-8b88a5593ee0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.668626] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b61e2d52-2765-44a8-a8cc-4d59c43e6c88 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.683098] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Instance VIF info [] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 675.688900] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating folder: Project (c9c75e1c484c451cacff62241186b865). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 675.689206] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-126c4e1c-f546-4869-a452-aa8a6560c06e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.703653] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created folder: Project (c9c75e1c484c451cacff62241186b865) in parent group-v168905. [ 675.703770] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating folder: Instances. Parent ref: group-v168928. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 675.704090] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8e956f82-85db-42dd-a7e0-da2233535690 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.715250] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created folder: Instances in parent group-v168928. 
[ 675.715873] env[59490]: DEBUG oslo.service.loopingcall [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 675.715873] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 675.715873] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d1eded66-8323-429a-8ff6-8c309d494485 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 675.737354] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 675.737354] env[59490]: value = "task-707386" [ 675.737354] env[59490]: _type = "Task" [ 675.737354] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 675.745443] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707386, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 676.251091] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707386, 'name': CreateVM_Task, 'duration_secs': 0.248524} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 676.251275] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 676.251693] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 676.251845] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 676.252205] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 676.252449] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eb90a7bf-55d0-45c3-96a2-4f7887d9ec72 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 676.257197] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 676.257197] env[59490]: value = 
"session[5210d47d-495b-8849-c195-d8b439f95142]52e9b71b-8192-d26e-4cb3-38e3d8889b04" [ 676.257197] env[59490]: _type = "Task" [ 676.257197] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 676.266089] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52e9b71b-8192-d26e-4cb3-38e3d8889b04, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 676.385034] env[59490]: DEBUG nova.compute.manager [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Received event network-vif-plugged-1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 676.385034] env[59490]: DEBUG oslo_concurrency.lockutils [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] Acquiring lock "398edc73-9487-4365-9e55-6eaa1f530f64-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.385034] env[59490]: DEBUG oslo_concurrency.lockutils [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] Lock "398edc73-9487-4365-9e55-6eaa1f530f64-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.385034] env[59490]: DEBUG oslo_concurrency.lockutils [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] Lock "398edc73-9487-4365-9e55-6eaa1f530f64-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 676.385222] env[59490]: DEBUG nova.compute.manager [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] No waiting events found dispatching network-vif-plugged-1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 676.385222] env[59490]: WARNING nova.compute.manager [req-4874a292-5a17-4706-b554-08ac71e6cbdb req-963d4d1e-4626-4e86-807c-ac37d6d967f8 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Received unexpected event network-vif-plugged-1e015731-afb6-494c-a1c9-cada19c93973 for instance with vm_state building and task_state spawning. 
[ 676.632560] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Successfully created port: b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 676.769610] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 676.769863] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 676.770084] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 676.939368] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 676.939647] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 676.954334] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 677.019465] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.019729] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.021275] env[59490]: INFO nova.compute.claims [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 677.036225] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Successfully updated port: 012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 677.052622] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 677.052622] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquired lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 677.052622] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 677.224995] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 677.262990] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-825180c9-fb7b-4b8e-9b5f-67cdb428f238 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.272358] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4336164-982d-4ee6-bd0a-046ceb3a8e11 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.304471] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-89404802-7f07-41da-9471-bf9892371616 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.313892] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb8d5e8-f82c-4b4d-8263-4414970926b0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.327181] env[59490]: DEBUG nova.compute.provider_tree [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 677.339626] env[59490]: DEBUG nova.scheduler.client.report [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 677.354958] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.335s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.355345] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Start building networks asynchronously for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 677.390962] env[59490]: DEBUG nova.compute.utils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 677.393290] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 677.393462] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 677.406487] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 677.507823] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 677.548453] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 677.548696] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 677.548846] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 677.549145] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 677.549421] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 677.549526] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 677.549664] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 677.549877] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 677.549963] env[59490]: DEBUG
nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 677.550132] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 677.550302] env[59490]: DEBUG nova.virt.hardware [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 677.551188] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-626f4194-a54f-4065-9a5a-30c99aba8e83 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.565183] env[59490]: DEBUG nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Received event network-vif-plugged-1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 677.565303] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Acquiring lock "08923fae-e356-444d-b221-b40576b54af9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 677.565515] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Lock "08923fae-e356-444d-b221-b40576b54af9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 677.565668] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Lock "08923fae-e356-444d-b221-b40576b54af9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 677.565835] env[59490]: DEBUG nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] No waiting events found dispatching network-vif-plugged-1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 677.566523] env[59490]: WARNING nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Received unexpected event network-vif-plugged-1d2f9217-4341-4b1c-a20f-7eca46015c75 for instance with vm_state building and task_state spawning.
[ 677.566523] env[59490]: DEBUG nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Received event network-changed-80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 677.566523] env[59490]: DEBUG nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Refreshing instance network info cache due to event network-changed-80c207c0-df3b-451a-924a-9a4d0cffc30b. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 677.567141] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Acquiring lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 677.567141] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Acquired lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 677.567141] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Refreshing network info cache for port 80c207c0-df3b-451a-924a-9a4d0cffc30b {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 677.574876] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34e6f145-ed72-4f07-a557-1abffc0b88ff {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 677.837824] env[59490]: DEBUG nova.policy [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4692ee9846954c52bb460f5181a27c4e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'efca172fc0d849a1bda4106da1369768', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 678.836362] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Updating instance_info_cache with network_info: [{"id": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "address": "fa:16:3e:a3:94:e9", "network": {"id": "861dfd56-9952-4fd6-907a-aa2cc4fc3dcc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1289206676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9880d419bb3d456c976df0d229b4323f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cb971244-43ba-41b4-a6a2-a4558548012c", "external-id": "nsx-vlan-transportzone-873", "segmentation_id": 873, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap012003c2-2c", "ovs_interfaceid": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 678.855866] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Releasing lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 678.856196] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance network_info: |[{"id": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "address": "fa:16:3e:a3:94:e9", "network": {"id": "861dfd56-9952-4fd6-907a-aa2cc4fc3dcc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1289206676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9880d419bb3d456c976df0d229b4323f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cb971244-43ba-41b4-a6a2-a4558548012c", "external-id": "nsx-vlan-transportzone-873", "segmentation_id": 873, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap012003c2-2c", "ovs_interfaceid": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 678.856618] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a3:94:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'cb971244-43ba-41b4-a6a2-a4558548012c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '012003c2-2cb2-4fd7-87d7-79aa1f6c4a50', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 678.866435] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Creating 
folder: Project (9880d419bb3d456c976df0d229b4323f). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 678.868233] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Successfully updated port: ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 678.872243] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2498b191-7810-403d-9151-c3f512acfc6b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.880227] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 678.880364] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquired lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 678.880547] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 678.884568] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Created folder: Project (9880d419bb3d456c976df0d229b4323f) in parent group-v168905. [ 678.884984] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Creating folder: Instances. Parent ref: group-v168931. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 678.890016] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7ee23efa-cfc5-408e-981b-655926eaa4ac {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.901292] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Created folder: Instances in parent group-v168931. [ 678.901530] env[59490]: DEBUG oslo.service.loopingcall [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 678.901707] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 678.901900] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3a13311a-3b65-4320-abbb-bcc1cd19f3cc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 678.925621] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 678.925621] env[59490]: value = "task-707389" [ 678.925621] env[59490]: _type = "Task" [ 678.925621] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 678.935502] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707389, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 679.084156] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 679.439261] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707389, 'name': CreateVM_Task, 'duration_secs': 0.330815} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 679.439437] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 679.440356] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.440476] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 679.440832] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 679.441357] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-90c9295b-5998-453f-889e-56ccf10d3032 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 679.449753] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 
tempest-ServerActionsTestOtherA-352470216-project-member] Waiting for the task: (returnval){ [ 679.449753] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52b3e784-0968-b9a1-f8b5-76da16ae4da2" [ 679.449753] env[59490]: _type = "Task" [ 679.449753] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 679.459246] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52b3e784-0968-b9a1-f8b5-76da16ae4da2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 679.785758] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "31207de9-e903-4ed4-bccc-c0796edec34b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.785971] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "31207de9-e903-4ed4-bccc-c0796edec34b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.798632] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Successfully updated port: b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 679.805749] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 679.808506] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.808724] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquired lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 679.808838] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 679.870304] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 679.870651] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 679.872177] env[59490]: INFO nova.compute.claims [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 679.917206] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 679.944856] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Updated VIF entry in instance network info cache for port 80c207c0-df3b-451a-924a-9a4d0cffc30b. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 679.945033] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Updating instance_info_cache with network_info: [{"id": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "address": "fa:16:3e:a7:2c:75", "network": {"id": "5e4e1842-a942-4bb5-b90a-0ba9c7439795", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-2115201309-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "88b5a27edd5f4d9796aa911867721f9d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f85835c8-5d0c-4b2f-97c4-6c4006580f79", "external-id": "nsx-vlan-transportzone-245", "segmentation_id": 245, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap80c207c0-df", "ovs_interfaceid": "80c207c0-df3b-451a-924a-9a4d0cffc30b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 679.968219] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 679.968537] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 679.968873] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.969286] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Releasing lock "refresh_cache-6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 679.969517] env[59490]: DEBUG nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Received event network-changed-1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 679.969705] env[59490]: DEBUG 
nova.compute.manager [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Refreshing instance network info cache due to event network-changed-1d2f9217-4341-4b1c-a20f-7eca46015c75. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 679.969897] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Acquiring lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 679.970038] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Acquired lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 679.970189] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Refreshing network info cache for port 1d2f9217-4341-4b1c-a20f-7eca46015c75 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 680.077250] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df83a5d3-a042-49e7-827b-19f61a4401ac {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.089446] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3577d20f-0c3c-40c2-99a5-d7b3252b7247 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.122992] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-618b7040-ee49-47a4-8e42-acc1d51cc916 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.132632] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d01a27fb-379c-437c-8fcd-3736aaa5534d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.147277] env[59490]: DEBUG nova.compute.provider_tree [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 680.162633] env[59490]: DEBUG nova.scheduler.client.report [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 680.179084] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.308s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 680.179256] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 680.225879] env[59490]: DEBUG nova.compute.utils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 680.227393] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 680.227700] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 680.245043] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Start building block device mappings for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 680.312613] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Updating instance_info_cache with network_info: [{"id": "b8dc67a8-c070-49a3-af75-8091871a2e25", "address": "fa:16:3e:b4:8a:0f", "network": {"id": "0dd7ff06-26af-46e5-b25b-e81bfb0d2a84", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1202834820-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "df2c0efc6a764e93809ce650abd4a2ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8dc67a8-c0", "ovs_interfaceid": "b8dc67a8-c070-49a3-af75-8091871a2e25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 680.326091] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Releasing lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 680.326445] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance network_info: |[{"id": "b8dc67a8-c070-49a3-af75-8091871a2e25", "address": "fa:16:3e:b4:8a:0f", "network": {"id": "0dd7ff06-26af-46e5-b25b-e81bfb0d2a84", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1202834820-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "df2c0efc6a764e93809ce650abd4a2ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8dc67a8-c0", "ovs_interfaceid": "b8dc67a8-c070-49a3-af75-8091871a2e25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 680.327504] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b4:8a:0f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd7836a5b-a91e-4d3f-8e96-afe024f62bb5', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b8dc67a8-c070-49a3-af75-8091871a2e25', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 680.340386] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Creating folder: Project (df2c0efc6a764e93809ce650abd4a2ee). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.340386] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9a96bf4f-df46-4c5a-a1ed-08c707e730fe {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.341557] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 680.357770] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Created folder: Project (df2c0efc6a764e93809ce650abd4a2ee) in parent group-v168905. [ 680.357961] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Creating folder: Instances. Parent ref: group-v168934. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.358229] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-03f3fa56-2e63-4e24-a96e-db5621b28e5a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.368352] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Created folder: Instances in parent group-v168934. [ 680.368708] env[59490]: DEBUG oslo.service.loopingcall [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 680.368809] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 680.370846] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b45bc5cf-c383-4cd8-892e-5d740c3c9462 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.389511] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 680.389613] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 680.390444] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 680.390444] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 680.390444] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 680.390444] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 680.390444] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] 
Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 680.390706] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 680.390706] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 680.390825] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 680.390980] env[59490]: DEBUG nova.virt.hardware [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 680.392147] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-138b1f2e-9fc0-422c-b245-c3d387d5b761 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.404562] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 680.404562] env[59490]: value = "task-707392" [ 680.404562] env[59490]: _type = "Task" [ 680.404562] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.405768] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61cb2076-7ea1-4d92-8412-d3144d140c0e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.419667] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707392, 'name': CreateVM_Task} progress is 6%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.506546] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Updating instance_info_cache with network_info: [{"id": "ec348de0-422c-4320-b593-d676a35120fa", "address": "fa:16:3e:10:7a:aa", "network": {"id": "cd43e8d3-18cd-4c0e-b748-36432a117b9d", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-317203583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6c6630fd91742da8a42a3c0ae5a5e04", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec348de0-42", "ovs_interfaceid": "ec348de0-422c-4320-b593-d676a35120fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 680.524871] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Releasing lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 680.525212] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance network_info: |[{"id": "ec348de0-422c-4320-b593-d676a35120fa", "address": "fa:16:3e:10:7a:aa", "network": {"id": "cd43e8d3-18cd-4c0e-b748-36432a117b9d", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-317203583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6c6630fd91742da8a42a3c0ae5a5e04", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec348de0-42", "ovs_interfaceid": "ec348de0-422c-4320-b593-d676a35120fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 680.525611] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:10:7a:aa', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'e23c1d18-c841-49ea-95f3-df5ceac28afd', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ec348de0-422c-4320-b593-d676a35120fa', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 680.540992] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Creating folder: Project (e6c6630fd91742da8a42a3c0ae5a5e04). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.541636] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-5caa1c6f-5cd3-4239-99c2-cf2b94c051ae {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.554663] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Created folder: Project (e6c6630fd91742da8a42a3c0ae5a5e04) in parent group-v168905. [ 680.555233] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Creating folder: Instances. Parent ref: group-v168937. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 680.555233] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-013b934f-f624-4157-946e-4283699e7b7c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.574722] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Created folder: Instances in parent group-v168937. [ 680.574722] env[59490]: DEBUG oslo.service.loopingcall [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 680.574722] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 680.574722] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f2decbc2-7bc3-4b5a-b9d7-29c44f53ef94 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.599391] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 680.599391] env[59490]: value = "task-707395" [ 680.599391] env[59490]: _type = "Task" [ 680.599391] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.610129] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707395, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 680.617111] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Successfully created port: 72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 680.840139] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "2f083456-3eb9-4022-86a3-8d39f83c470f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.840379] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "2f083456-3eb9-4022-86a3-8d39f83c470f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.853074] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 680.908958] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 680.909221] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 680.911171] env[59490]: INFO nova.compute.claims [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 680.924836] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707392, 'name': CreateVM_Task, 'duration_secs': 0.305662} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 680.924836] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 680.924836] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 680.924836] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 680.928070] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 680.928070] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5327bfa2-652e-48a7-99bd-db728102617c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 680.931774] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Waiting for the task: (returnval){ [ 680.931774] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5221a067-4dc7-724d-e5ba-0a20508d4506" [ 680.931774] env[59490]: _type = "Task" [ 680.931774] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 680.940459] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5221a067-4dc7-724d-e5ba-0a20508d4506, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.099419] env[59490]: DEBUG nova.policy [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5d7c2bfeaf0b405281fa5a9ef525d64d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '871a7d77e44d42e88d6e3dfffd9a6c5f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 681.113142] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707395, 'name': CreateVM_Task, 'duration_secs': 0.325096} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 681.114796] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 681.115886] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 681.156505] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a11cf78-ad5a-4349-9f94-994584c4fc1d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.163626] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9b1f327-691c-4440-836c-400f3ffb0486 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.196619] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3b53482-4b4a-4c72-ad2d-0084b942da89 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.204666] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09229de6-bdf0-47bd-a13d-933d3599d325 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.220893] env[59490]: DEBUG nova.compute.provider_tree [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 681.231058] env[59490]: DEBUG nova.scheduler.client.report [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 
'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 681.250735] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.340s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 681.250735] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 681.285686] env[59490]: DEBUG nova.compute.utils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 681.290021] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 681.290021] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 681.297234] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 681.388130] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 681.413448] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 681.413716] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 681.413864] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 681.414056] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 681.414201] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 681.414341] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 681.414543] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 681.417252] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 681.417252] env[59490]: DEBUG nova.virt.hardware [None 
req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 681.417252] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 681.417252] env[59490]: DEBUG nova.virt.hardware [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 681.417252] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acb9d3c8-35bc-4af2-98fe-bcce34205efa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.426799] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b32fb4d8-9d07-4ceb-aa36-65ef925bc555 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.449408] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 681.449810] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 681.450067] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 681.450277] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 681.450644] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 681.453570] env[59490]: DEBUG 
oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ec57de2-2004-4b23-9799-f443fd9e03f3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 681.458888] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Waiting for the task: (returnval){ [ 681.458888] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52f0f901-d43b-466c-169b-f23535205888" [ 681.458888] env[59490]: _type = "Task" [ 681.458888] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 681.466364] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52f0f901-d43b-466c-169b-f23535205888, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 681.473402] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "581848be-38fb-42da-b723-480bf297d1a5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 681.473611] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "581848be-38fb-42da-b723-480bf297d1a5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 681.751086] env[59490]: DEBUG nova.policy [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'cfe65946aa144d2197e05a0df4836768', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e0f71e94488943f9b7989439be3152d5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 681.970352] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 681.970964] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 
{{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 681.971328] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.132739] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.133180] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.380241] env[59490]: DEBUG nova.compute.manager [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Received event network-changed-1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 682.380241] env[59490]: DEBUG nova.compute.manager [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Refreshing instance network info cache due to event network-changed-1e015731-afb6-494c-a1c9-cada19c93973. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 682.380241] env[59490]: DEBUG oslo_concurrency.lockutils [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] Acquiring lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.380241] env[59490]: DEBUG oslo_concurrency.lockutils [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] Acquired lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 682.381424] env[59490]: DEBUG nova.network.neutron [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Refreshing network info cache for port 1e015731-afb6-494c-a1c9-cada19c93973 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 682.534652] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Updated VIF entry in instance network info cache for port 1d2f9217-4341-4b1c-a20f-7eca46015c75. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 682.534652] env[59490]: DEBUG nova.network.neutron [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] [instance: 08923fae-e356-444d-b221-b40576b54af9] Updating instance_info_cache with network_info: [{"id": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "address": "fa:16:3e:51:6d:cd", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1d2f9217-43", "ovs_interfaceid": "1d2f9217-4341-4b1c-a20f-7eca46015c75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.543112] env[59490]: DEBUG oslo_concurrency.lockutils [req-71374a0f-5436-4041-b03f-e4b5bb5cf13b req-d4d6a06d-2b6a-48a2-bd3c-8ea6de2c0306 service nova] Releasing lock "refresh_cache-08923fae-e356-444d-b221-b40576b54af9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.938188] env[59490]: DEBUG nova.network.neutron [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Updated VIF entry in instance network info cache for port 1e015731-afb6-494c-a1c9-cada19c93973. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 682.938530] env[59490]: DEBUG nova.network.neutron [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Updating instance_info_cache with network_info: [{"id": "1e015731-afb6-494c-a1c9-cada19c93973", "address": "fa:16:3e:15:80:2f", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1e015731-af", "ovs_interfaceid": "1e015731-afb6-494c-a1c9-cada19c93973", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 682.950993] env[59490]: DEBUG oslo_concurrency.lockutils [req-4cc5bc22-6115-4a81-bf89-3132c12c9d12 req-9c04ee40-5ddb-432a-b17e-bbde2d974f78 service nova] Releasing lock "refresh_cache-398edc73-9487-4365-9e55-6eaa1f530f64" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 682.989932] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Received event network-vif-plugged-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 682.990209] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquiring lock "71698ce4-94a0-442c-8081-374616ce2ac4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 682.990413] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Lock "71698ce4-94a0-442c-8081-374616ce2ac4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 682.990813] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Lock "71698ce4-94a0-442c-8081-374616ce2ac4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 682.991027] env[59490]: DEBUG nova.compute.manager 
[req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] No waiting events found dispatching network-vif-plugged-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 682.991327] env[59490]: WARNING nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Received unexpected event network-vif-plugged-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 for instance with vm_state building and task_state spawning. [ 682.991446] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Received event network-changed-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 682.991650] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Refreshing instance network info cache due to event network-changed-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 682.991853] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquiring lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 682.992018] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquired lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 682.992201] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Refreshing network info cache for port 012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 683.151425] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "3464c5af-60a4-4b6d-b7ca-51cf7312cf09" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.151666] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "3464c5af-60a4-4b6d-b7ca-51cf7312cf09" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.464448] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Updated VIF entry in instance network info 
cache for port 012003c2-2cb2-4fd7-87d7-79aa1f6c4a50. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 683.464448] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Updating instance_info_cache with network_info: [{"id": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "address": "fa:16:3e:a3:94:e9", "network": {"id": "861dfd56-9952-4fd6-907a-aa2cc4fc3dcc", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-1289206676-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9880d419bb3d456c976df0d229b4323f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "cb971244-43ba-41b4-a6a2-a4558548012c", "external-id": "nsx-vlan-transportzone-873", "segmentation_id": 873, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap012003c2-2c", "ovs_interfaceid": "012003c2-2cb2-4fd7-87d7-79aa1f6c4a50", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 683.481100] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Releasing lock "refresh_cache-71698ce4-94a0-442c-8081-374616ce2ac4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 683.481100] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Received event network-vif-plugged-ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 683.481100] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquiring lock "0ec55812-86b7-44ef-822a-88a2ff1816c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 683.481100] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Lock "0ec55812-86b7-44ef-822a-88a2ff1816c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 683.481222] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Lock "0ec55812-86b7-44ef-822a-88a2ff1816c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 683.481222] env[59490]: 
DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] No waiting events found dispatching network-vif-plugged-ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 683.481222] env[59490]: WARNING nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Received unexpected event network-vif-plugged-ec348de0-422c-4320-b593-d676a35120fa for instance with vm_state building and task_state spawning. [ 683.481222] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Received event network-changed-ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 683.481331] env[59490]: DEBUG nova.compute.manager [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Refreshing instance network info cache due to event network-changed-ec348de0-422c-4320-b593-d676a35120fa. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 683.481331] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquiring lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 683.481331] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Acquired lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 683.481331] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Refreshing network info cache for port ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 683.776423] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Successfully created port: 81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 684.578994] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Updated VIF entry in instance network info cache for port ec348de0-422c-4320-b593-d676a35120fa. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 684.579355] env[59490]: DEBUG nova.network.neutron [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Updating instance_info_cache with network_info: [{"id": "ec348de0-422c-4320-b593-d676a35120fa", "address": "fa:16:3e:10:7a:aa", "network": {"id": "cd43e8d3-18cd-4c0e-b748-36432a117b9d", "bridge": "br-int", "label": "tempest-ServerMetadataTestJSON-317203583-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e6c6630fd91742da8a42a3c0ae5a5e04", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "e23c1d18-c841-49ea-95f3-df5ceac28afd", "external-id": "nsx-vlan-transportzone-774", "segmentation_id": 774, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapec348de0-42", "ovs_interfaceid": "ec348de0-422c-4320-b593-d676a35120fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 684.593152] env[59490]: DEBUG oslo_concurrency.lockutils [req-b6ef926f-7e95-4a23-950a-c565432f321a req-20eea888-97a6-44cb-8bb6-253c6d7ed2ae service nova] Releasing lock "refresh_cache-0ec55812-86b7-44ef-822a-88a2ff1816c3" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 684.723272] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Successfully created port: b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 685.130544] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Successfully updated port: 72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 685.143988] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 685.144804] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquired lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 685.145495] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 
tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 685.357639] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 686.208776] env[59490]: DEBUG nova.compute.manager [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Received event network-vif-plugged-b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 686.209042] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Acquiring lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.209529] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.209782] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.209984] env[59490]: DEBUG nova.compute.manager [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] No waiting events found dispatching network-vif-plugged-b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 686.210162] env[59490]: WARNING nova.compute.manager [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Received unexpected event network-vif-plugged-b8dc67a8-c070-49a3-af75-8091871a2e25 for instance with vm_state building and task_state spawning. 
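
The sequence just above ("No waiting events found dispatching network-vif-plugged-…" followed by the WARNING about an unexpected event) is the compute manager dropping an external Neutron event that arrived before any waiter had registered for it. A minimal sketch of that register-then-pop pattern, with simplified names (this InstanceEvents class is an illustration, not Nova's actual implementation):

import threading
from collections import defaultdict

class InstanceEvents:
    """Illustrative stand-in for the per-instance event registry."""
    def __init__(self):
        self._lock = threading.Lock()        # plays the role of the "<uuid>-events" lock
        self._waiters = defaultdict(dict)    # instance uuid -> {event name: Event}

    def prepare_for_event(self, instance_uuid, event_name):
        waiter = threading.Event()
        with self._lock:
            self._waiters[instance_uuid][event_name] = waiter
        return waiter

    def pop_instance_event(self, instance_uuid, event_name):
        with self._lock:
            return self._waiters[instance_uuid].pop(event_name, None)

def dispatch_external_event(events, instance_uuid, event_name):
    waiter = events.pop_instance_event(instance_uuid, event_name)
    if waiter is None:
        # Nobody registered for this event yet -- the instance is still
        # building/spawning, hence the "Received unexpected event" WARNING.
        print("Received unexpected event %s for %s" % (event_name, instance_uuid))
    else:
        waiter.set()  # wake the thread blocked waiting on this event
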
[ 686.210318] env[59490]: DEBUG nova.compute.manager [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Received event network-changed-b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 686.210465] env[59490]: DEBUG nova.compute.manager [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Refreshing instance network info cache due to event network-changed-b8dc67a8-c070-49a3-af75-8091871a2e25. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 686.210643] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Acquiring lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 686.210774] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Acquired lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 686.210925] env[59490]: DEBUG nova.network.neutron [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Refreshing network info cache for port b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 686.660352] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Updating instance_info_cache with network_info: [{"id": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "address": "fa:16:3e:ab:87:b1", "network": {"id": "6b17303f-fe3c-438f-8ffa-458e67f9a924", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-759847990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "efca172fc0d849a1bda4106da1369768", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72ccc8c5-8b", "ovs_interfaceid": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 686.675095] env[59490]: DEBUG nova.compute.manager [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Received event 
network-vif-plugged-72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 686.675095] env[59490]: DEBUG oslo_concurrency.lockutils [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] Acquiring lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 686.675095] env[59490]: DEBUG oslo_concurrency.lockutils [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] Lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 686.675095] env[59490]: DEBUG oslo_concurrency.lockutils [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] Lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 686.675237] env[59490]: DEBUG nova.compute.manager [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] No waiting events found dispatching network-vif-plugged-72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 686.675237] env[59490]: WARNING nova.compute.manager [req-5519b52f-1ed2-4706-b9d6-9413e243ba15 req-3be254fa-1264-446b-8b73-ffd273228e90 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Received unexpected event network-vif-plugged-72ccc8c5-8b84-4ce5-a12a-16920839c294 for instance with vm_state building and task_state spawning. 
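
The "Acquiring lock … / Lock … acquired by … waited 0.000s / … released … held 0.000s" triples that dominate this log are emitted by oslo.concurrency. A minimal reproduction using the real lockutils API (the lock names here are only examples):

from oslo_concurrency import lockutils

# Context-manager form: produces the same acquire/release DEBUG lines,
# including the waited/held timings, when debug logging is enabled.
with lockutils.lock('refresh_cache-<instance-uuid>'):
    pass  # critical section, e.g. refreshing one instance's network cache

# Decorator form, as used for build_and_run_instance-style serialization.
@lockutils.synchronized('<instance-uuid>')
def _locked_do_build_and_run_instance():
    pass  # only one build runs per instance uuid at a time
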
[ 686.685734] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Releasing lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 686.685734] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance network_info: |[{"id": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "address": "fa:16:3e:ab:87:b1", "network": {"id": "6b17303f-fe3c-438f-8ffa-458e67f9a924", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-759847990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "efca172fc0d849a1bda4106da1369768", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72ccc8c5-8b", "ovs_interfaceid": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 686.685877] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ab:87:b1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '72ccc8c5-8b84-4ce5-a12a-16920839c294', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 686.694107] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Creating folder: Project (efca172fc0d849a1bda4106da1369768). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 686.694817] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a2933742-bc1e-4c92-a13c-26633d58015f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.705751] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Created folder: Project (efca172fc0d849a1bda4106da1369768) in parent group-v168905. 
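
The Folder.CreateFolder invocation above, and the CreateVM_Task that follows, go through the oslo.vmware session layer: invoke_api() issues the SOAP call, and wait_for_task() does the polling that produces the "progress is 0%" / "completed successfully" lines. A hedged sketch of that call pattern (the managed-object refs and spec are parameters here; argument spellings follow the vSphere API but are illustrative):

from oslo_vmware import api

def create_vm_in_new_folder(session, parent_folder_ref, vm_config_spec, pool_ref):
    # Synchronous vSphere call: returns the new Folder managed-object ref.
    instances_folder = session.invoke_api(session.vim, 'CreateFolder',
                                          parent_folder_ref, name='Instances')
    # Asynchronous call: returns a Task moref that wait_for_task() polls.
    task = session.invoke_api(session.vim, 'CreateVM_Task', instances_folder,
                              config=vm_config_spec, pool=pool_ref)
    return session.wait_for_task(task)

# session = api.VMwareAPISession(host, username, password,
#                                api_retry_count=10, task_poll_interval=0.5)
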
[ 686.705948] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Creating folder: Instances. Parent ref: group-v168940. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 686.708817] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7a53b956-1c74-45d0-9766-80ceac2aa9c0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.718744] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Created folder: Instances in parent group-v168940. [ 686.718744] env[59490]: DEBUG oslo.service.loopingcall [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 686.718908] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 686.719127] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d5365dde-0178-4944-8c7b-566793cc3582 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 686.738982] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 686.738982] env[59490]: value = "task-707398" [ 686.738982] env[59490]: _type = "Task" [ 686.738982] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 686.747264] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707398, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 687.212188] env[59490]: DEBUG nova.network.neutron [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Updated VIF entry in instance network info cache for port b8dc67a8-c070-49a3-af75-8091871a2e25. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 687.212188] env[59490]: DEBUG nova.network.neutron [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Updating instance_info_cache with network_info: [{"id": "b8dc67a8-c070-49a3-af75-8091871a2e25", "address": "fa:16:3e:b4:8a:0f", "network": {"id": "0dd7ff06-26af-46e5-b25b-e81bfb0d2a84", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-1202834820-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "df2c0efc6a764e93809ce650abd4a2ee", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d7836a5b-a91e-4d3f-8e96-afe024f62bb5", "external-id": "nsx-vlan-transportzone-419", "segmentation_id": 419, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8dc67a8-c0", "ovs_interfaceid": "b8dc67a8-c070-49a3-af75-8091871a2e25", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 687.224751] env[59490]: DEBUG oslo_concurrency.lockutils [req-d535c1a7-7515-4dde-9417-530d6542c345 req-7a8ded58-6a50-492d-a49a-07c014c0ef60 service nova] Releasing lock "refresh_cache-ad8223ea-b097-439f-bcff-9c06bd1cf5e6" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 687.252792] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707398, 'name': CreateVM_Task, 'duration_secs': 0.284079} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 687.253064] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 687.254051] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 687.254466] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 687.254940] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 687.257551] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ea2b986-d7dc-4e5d-a7e5-477e1560d89b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 687.262990] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Waiting for the task: (returnval){ [ 687.262990] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]520c0eb5-ae12-e773-9634-79997e2159f9" [ 687.262990] env[59490]: _type = "Task" [ 687.262990] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 687.272637] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]520c0eb5-ae12-e773-9634-79997e2159f9, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 687.780012] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 687.780012] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 687.780549] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 688.127220] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Successfully updated port: 81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 688.136958] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 688.137115] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquired lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 688.137262] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 688.268152] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 688.628100] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Successfully updated port: b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 688.646192] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 688.646192] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquired lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 688.647220] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 688.820597] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 689.059587] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Updating instance_info_cache with network_info: [{"id": "81b838dd-6028-40eb-ad00-c1499bff521a", "address": "fa:16:3e:a7:62:32", "network": {"id": "92c77840-0155-4149-9868-219f9dfffc30", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-665982562-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "871a7d77e44d42e88d6e3dfffd9a6c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81b838dd-60", "ovs_interfaceid": "81b838dd-6028-40eb-ad00-c1499bff521a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.075917] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Releasing lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 689.076461] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance network_info: |[{"id": "81b838dd-6028-40eb-ad00-c1499bff521a", "address": "fa:16:3e:a7:62:32", "network": {"id": "92c77840-0155-4149-9868-219f9dfffc30", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-665982562-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "871a7d77e44d42e88d6e3dfffd9a6c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81b838dd-60", "ovs_interfaceid": "81b838dd-6028-40eb-ad00-c1499bff521a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 689.077137] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a7:62:32', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '0cc30a16-f070-421c-964e-50c9aa32f17a', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '81b838dd-6028-40eb-ad00-c1499bff521a', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 689.085135] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Creating folder: Project (871a7d77e44d42e88d6e3dfffd9a6c5f). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 689.087451] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-1e3da829-7346-4fb1-9b22-c3509be0e8af {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.101464] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Created folder: Project (871a7d77e44d42e88d6e3dfffd9a6c5f) in parent group-v168905. [ 689.101636] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Creating folder: Instances. Parent ref: group-v168943. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 689.101874] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fe6ae0d9-f055-4256-9b2d-95c4fedbc569 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.112193] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Created folder: Instances in parent group-v168943. [ 689.112411] env[59490]: DEBUG oslo.service.loopingcall [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 689.112926] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 689.112926] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a7c2fe9a-48a0-4ab8-bccb-187313862594 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.135750] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 689.135750] env[59490]: value = "task-707401" [ 689.135750] env[59490]: _type = "Task" [ 689.135750] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 689.145308] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707401, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.404584] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Updating instance_info_cache with network_info: [{"id": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "address": "fa:16:3e:1d:73:0e", "network": {"id": "16708a5c-3beb-415f-8c1a-79465460b7de", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-462945539-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e0f71e94488943f9b7989439be3152d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0275025-62", "ovs_interfaceid": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 689.419015] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Releasing lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 689.419015] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance network_info: |[{"id": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "address": "fa:16:3e:1d:73:0e", "network": {"id": "16708a5c-3beb-415f-8c1a-79465460b7de", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-462945539-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], 
"gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e0f71e94488943f9b7989439be3152d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0275025-62", "ovs_interfaceid": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 689.419969] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1d:73:0e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'a9abd00f-2cea-40f8-9804-a56b6431192d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b0275025-626e-4293-bb18-a14ae7ed9ca5', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 689.428144] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Creating folder: Project (e0f71e94488943f9b7989439be3152d5). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 689.428964] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-ecea1a19-50de-4bcf-9364-90927b611e2d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.441113] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Created folder: Project (e0f71e94488943f9b7989439be3152d5) in parent group-v168905. [ 689.441113] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Creating folder: Instances. Parent ref: group-v168946. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 689.441113] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8f76c8e-d5df-49ad-94a3-5ce2c35fa216 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.450197] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Created folder: Instances in parent group-v168946. 
[ 689.450753] env[59490]: DEBUG oslo.service.loopingcall [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 689.451085] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 689.451400] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3d87bfed-753f-4b1e-9ff8-223ba351a8f8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.476112] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 689.476112] env[59490]: value = "task-707404" [ 689.476112] env[59490]: _type = "Task" [ 689.476112] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 689.482950] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707404, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.647477] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707401, 'name': CreateVM_Task, 'duration_secs': 0.309984} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 689.647916] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 689.648743] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 689.649087] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 689.650538] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 689.650935] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2773e5bd-9a62-4b4e-bef6-38df11d3b4d9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 689.656372] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Waiting for the task: 
(returnval){ [ 689.656372] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52e25544-5c85-ac85-4291-64503b485099" [ 689.656372] env[59490]: _type = "Task" [ 689.656372] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 689.666141] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52e25544-5c85-ac85-4291-64503b485099, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 689.985971] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707404, 'name': CreateVM_Task, 'duration_secs': 0.352032} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 689.986496] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 689.987372] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.168962] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 690.169506] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 690.170073] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.170413] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 690.170893] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquired external semaphore "[datastore2] 
devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 690.171290] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-96e828d2-010f-4c89-93a6-fce0b813db6f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 690.179072] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Waiting for the task: (returnval){ [ 690.179072] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5240ad83-0344-5c3e-fec9-6557a610b24b" [ 690.179072] env[59490]: _type = "Task" [ 690.179072] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 690.188556] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5240ad83-0344-5c3e-fec9-6557a610b24b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 690.238076] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Received event network-changed-72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 690.238410] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Refreshing instance network info cache due to event network-changed-72ccc8c5-8b84-4ce5-a12a-16920839c294. 
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 690.238656] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquiring lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.238816] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquired lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 690.239015] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Refreshing network info cache for port 72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 690.690481] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 690.690824] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 690.690880] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 690.844113] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "504e16b8-70d2-437f-ab3e-7631cb74abec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.847202] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "504e16b8-70d2-437f-ab3e-7631cb74abec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 690.883544] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "12082268-a4a2-4eb8-9adc-93c7e7d82c42" by
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 690.883776] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "12082268-a4a2-4eb8-9adc-93c7e7d82c42" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.046203] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Updated VIF entry in instance network info cache for port 72ccc8c5-8b84-4ce5-a12a-16920839c294. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 691.046978] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Updating instance_info_cache with network_info: [{"id": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "address": "fa:16:3e:ab:87:b1", "network": {"id": "6b17303f-fe3c-438f-8ffa-458e67f9a924", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-759847990-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "efca172fc0d849a1bda4106da1369768", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap72ccc8c5-8b", "ovs_interfaceid": "72ccc8c5-8b84-4ce5-a12a-16920839c294", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 691.059682] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Releasing lock "refresh_cache-67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 691.059682] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Received event network-vif-plugged-81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 691.059682] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquiring lock "31207de9-e903-4ed4-bccc-c0796edec34b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.059847] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Lock "31207de9-e903-4ed4-bccc-c0796edec34b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.060154] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Lock "31207de9-e903-4ed4-bccc-c0796edec34b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.060154] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] No waiting events found dispatching network-vif-plugged-81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 691.060469] env[59490]: WARNING nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Received unexpected event network-vif-plugged-81b838dd-6028-40eb-ad00-c1499bff521a for instance with vm_state building and task_state spawning. [ 691.060469] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Received event network-vif-plugged-b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 691.060593] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquiring lock "2f083456-3eb9-4022-86a3-8d39f83c470f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 691.060739] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Lock "2f083456-3eb9-4022-86a3-8d39f83c470f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 691.060884] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Lock "2f083456-3eb9-4022-86a3-8d39f83c470f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 691.061040] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] No waiting events found dispatching network-vif-plugged-b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 691.065415] env[59490]: WARNING
nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Received unexpected event network-vif-plugged-b0275025-626e-4293-bb18-a14ae7ed9ca5 for instance with vm_state building and task_state spawning. [ 691.065415] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Received event network-changed-81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 691.065415] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Refreshing instance network info cache due to event network-changed-81b838dd-6028-40eb-ad00-c1499bff521a. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 691.065415] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquiring lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 691.065415] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquired lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 691.065984] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Refreshing network info cache for port 81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 692.121020] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Updated VIF entry in instance network info cache for port 81b838dd-6028-40eb-ad00-c1499bff521a. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 692.121518] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Updating instance_info_cache with network_info: [{"id": "81b838dd-6028-40eb-ad00-c1499bff521a", "address": "fa:16:3e:a7:62:32", "network": {"id": "92c77840-0155-4149-9868-219f9dfffc30", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-665982562-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "871a7d77e44d42e88d6e3dfffd9a6c5f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "0cc30a16-f070-421c-964e-50c9aa32f17a", "external-id": "nsx-vlan-transportzone-424", "segmentation_id": 424, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap81b838dd-60", "ovs_interfaceid": "81b838dd-6028-40eb-ad00-c1499bff521a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 692.140442] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Releasing lock "refresh_cache-31207de9-e903-4ed4-bccc-c0796edec34b" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 692.140813] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Received event network-changed-b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 692.141019] env[59490]: DEBUG nova.compute.manager [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Refreshing instance network info cache due to event network-changed-b0275025-626e-4293-bb18-a14ae7ed9ca5. 
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 692.141335] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquiring lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 692.141506] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Acquired lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 692.141689] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Refreshing network info cache for port b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 692.360404] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9ac777b5-df98-4a01-9282-2b266d082fd4 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "de66fe1b-8f03-4a10-af9d-302cb5021f79" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 692.360623] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9ac777b5-df98-4a01-9282-2b266d082fd4 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "de66fe1b-8f03-4a10-af9d-302cb5021f79" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 693.344524] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Updated VIF entry in instance network info cache for port b0275025-626e-4293-bb18-a14ae7ed9ca5.
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 693.344524] env[59490]: DEBUG nova.network.neutron [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Updating instance_info_cache with network_info: [{"id": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "address": "fa:16:3e:1d:73:0e", "network": {"id": "16708a5c-3beb-415f-8c1a-79465460b7de", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-462945539-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e0f71e94488943f9b7989439be3152d5", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "a9abd00f-2cea-40f8-9804-a56b6431192d", "external-id": "nsx-vlan-transportzone-639", "segmentation_id": 639, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb0275025-62", "ovs_interfaceid": "b0275025-626e-4293-bb18-a14ae7ed9ca5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 693.361818] env[59490]: DEBUG oslo_concurrency.lockutils [req-aa84b87d-0daf-4942-9d19-e929eb7cfa3a req-4f386ccd-2217-451a-a18b-6b8c5bad8478 service nova] Releasing lock "refresh_cache-2f083456-3eb9-4022-86a3-8d39f83c470f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 694.191357] env[59490]: DEBUG oslo_concurrency.lockutils [None req-911027a7-e39b-481a-816b-35a39c0d4c61 tempest-ServersV294TestFqdnHostnames-1291042165 tempest-ServersV294TestFqdnHostnames-1291042165-project-member] Acquiring lock "b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 694.191700] env[59490]: DEBUG oslo_concurrency.lockutils [None req-911027a7-e39b-481a-816b-35a39c0d4c61 tempest-ServersV294TestFqdnHostnames-1291042165 tempest-ServersV294TestFqdnHostnames-1291042165-project-member] Lock "b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.291454] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b84f1516-ea7a-4dcc-adff-48af9aaea268 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "c1a44e57-5a85-475c-898e-9f30e0c6b492" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.291729] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b84f1516-ea7a-4dcc-adff-48af9aaea268 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member]
Lock "c1a44e57-5a85-475c-898e-9f30e0c6b492" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 695.364564] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1e98a9f3-ec85-41f4-a911-c9cb66091dbd tempest-SecurityGroupsTestJSON-19223672 tempest-SecurityGroupsTestJSON-19223672-project-member] Acquiring lock "2cbf0a49-2835-41c2-8840-9515f1e95d5a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 695.366024] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1e98a9f3-ec85-41f4-a911-c9cb66091dbd tempest-SecurityGroupsTestJSON-19223672 tempest-SecurityGroupsTestJSON-19223672-project-member] Lock "2cbf0a49-2835-41c2-8840-9515f1e95d5a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 696.864355] env[59490]: DEBUG oslo_concurrency.lockutils [None req-60684222-8bfa-400a-8103-572d32c227f1 tempest-ServerActionsTestOtherB-715024718 tempest-ServerActionsTestOtherB-715024718-project-member] Acquiring lock "d5b78319-88f9-4771-8b14-e833d05eb3d6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 696.864977] env[59490]: DEBUG oslo_concurrency.lockutils [None req-60684222-8bfa-400a-8103-572d32c227f1 tempest-ServerActionsTestOtherB-715024718 tempest-ServerActionsTestOtherB-715024718-project-member] Lock "d5b78319-88f9-4771-8b14-e833d05eb3d6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 698.359528] env[59490]: DEBUG oslo_concurrency.lockutils [None req-8e2b81fd-877f-49e5-8e94-747b25a8b66a tempest-ServerGroupTestJSON-200953235 tempest-ServerGroupTestJSON-200953235-project-member] Acquiring lock "cfe59672-be1d-43ed-b8d4-b5ed51e08a34" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 698.360090] env[59490]: DEBUG oslo_concurrency.lockutils [None req-8e2b81fd-877f-49e5-8e94-747b25a8b66a tempest-ServerGroupTestJSON-200953235 tempest-ServerGroupTestJSON-200953235-project-member] Lock "cfe59672-be1d-43ed-b8d4-b5ed51e08a34" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 705.384993] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 705.385309] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping...
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 706.384543] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 706.384735] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 706.397487] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.397810] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.397852] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 706.397985] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 706.399178] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4cf5849-5f2d-4aa0-a5cf-c1243ca96c7d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.408156] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-816ddb4d-5c02-4e39-9d33-6293736f9ba1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.421360] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4955ccd5-ea61-435f-bb99-1f0d0b555d27 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.427513] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11b41af2-0f1f-4627-9e0d-760054a18af6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.455841] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181632MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 706.455841] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 706.456025] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 706.541661] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.541661] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 08923fae-e356-444d-b221-b40576b54af9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.541661] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 398edc73-9487-4365-9e55-6eaa1f530f64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.541828] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 71698ce4-94a0-442c-8081-374616ce2ac4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.541883] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.542043] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.542181] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.542324] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.542449] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 31207de9-e903-4ed4-bccc-c0796edec34b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.542593] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2f083456-3eb9-4022-86a3-8d39f83c470f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 706.567288] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 581848be-38fb-42da-b723-480bf297d1a5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.591018] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.601757] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3464c5af-60a4-4b6d-b7ca-51cf7312cf09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.613112] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 504e16b8-70d2-437f-ab3e-7631cb74abec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.624023] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 12082268-a4a2-4eb8-9adc-93c7e7d82c42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.633535] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance de66fe1b-8f03-4a10-af9d-302cb5021f79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.643895] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.653548] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance c1a44e57-5a85-475c-898e-9f30e0c6b492 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.664384] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2cbf0a49-2835-41c2-8840-9515f1e95d5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.675153] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d5b78319-88f9-4771-8b14-e833d05eb3d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.704869] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance cfe59672-be1d-43ed-b8d4-b5ed51e08a34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 706.705139] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 706.705285] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 706.947496] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-595ee477-ce9c-4dd7-9238-b1093680bf00 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.955089] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ee5ea34-695b-4744-a992-4521ecb979dc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.983914] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5040b7f3-c2a3-4ecc-86d7-392b4ed2637f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 706.991039] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0629fc1d-ef8c-4cf4-949f-cbaaa8315a4f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 707.004886] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 707.012951] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 707.028128] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 707.028297] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.572s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 708.023641] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running 
periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.023901] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.024059] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.383556] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 708.383716] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 708.385047] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 708.403549] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.403699] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 08923fae-e356-444d-b221-b40576b54af9] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.403821] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.403938] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404108] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404237] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Skipping network cache update for instance because it is Building. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404351] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404468] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404579] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404692] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 708.404806] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 708.405283] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 709.384578] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 718.267428] env[59490]: WARNING oslo_vmware.rw_handles [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles 
http.client.RemoteDisconnected: Remote end closed connection without response [ 718.267428] env[59490]: ERROR oslo_vmware.rw_handles [ 718.268149] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 718.269483] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 718.269733] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Copying Virtual Disk [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/5193073a-1580-4b55-a41a-4e9d659bcd61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 718.270055] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f95633b0-a4e3-4cfb-a4cb-1896937b1d88 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.277737] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Waiting for the task: (returnval){ [ 718.277737] env[59490]: value = "task-707405" [ 718.277737] env[59490]: _type = "Task" [ 718.277737] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.285244] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Task: {'id': task-707405, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 718.788095] env[59490]: DEBUG oslo_vmware.exceptions [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Fault InvalidArgument not matched. 
{{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 718.788095] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 718.788551] env[59490]: ERROR nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.788551] env[59490]: Faults: ['InvalidArgument'] [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Traceback (most recent call last): [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] yield resources [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self.driver.spawn(context, instance, image_meta, [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self._fetch_image_if_missing(context, vi) [ 718.788551] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] image_cache(vi, tmp_image_ds_loc) [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] vm_util.copy_virtual_disk( [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] session._wait_for_task(vmdk_copy_task) [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return self.wait_for_task(task_ref) [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return evt.wait() [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] result = hub.switch() [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 718.788897] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return self.greenlet.switch() [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self.f(*self.args, **self.kw) [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] raise exceptions.translate_fault(task_info.error) [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Faults: ['InvalidArgument'] [ 718.789286] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] [ 718.789286] env[59490]: INFO nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Terminating instance [ 718.790432] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 718.790634] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 718.790859] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f1939cb5-939d-49fc-81ec-433a3935c599 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.793040] env[59490]: DEBUG nova.compute.manager 
[None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 718.793230] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 718.793942] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8371bdb1-39b2-436e-9553-e3a9eb3dff2a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.801150] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 718.801379] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-08bba017-f53b-4658-be41-80c14719c56b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.803596] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 718.803717] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 718.804917] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8a64b793-407b-40ce-9e2e-1b0be8f9aabe {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.809658] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){ [ 718.809658] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52f6409a-ffd5-8d95-5796-7bd4389b3b1d" [ 718.809658] env[59490]: _type = "Task" [ 718.809658] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.816940] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52f6409a-ffd5-8d95-5796-7bd4389b3b1d, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 718.873440] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 718.873651] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 718.873861] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Deleting the datastore file [datastore2] 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 718.874132] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4db77390-7393-4243-9d9b-55be78f7bc01 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 718.880574] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Waiting for the task: (returnval){ [ 718.880574] env[59490]: value = "task-707407" [ 718.880574] env[59490]: _type = "Task" [ 718.880574] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 718.888084] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Task: {'id': task-707407, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 719.319677] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 719.320021] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating directory with path [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 719.320100] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ca5bde33-f917-46a1-8253-55d5fd79c6c3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.331171] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created directory with path [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 719.331364] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Fetch image to [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 719.331527] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 719.332270] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3a1c6e7-064e-4bbe-9b23-a4e344828c84 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.338848] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40163152-bee9-40e5-812a-6e56cade1ce6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.347615] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43374231-16b9-4bc1-be30-72a0d2bff129 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.378805] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc6b0429-9de4-4c4d-814a-ca5bced223c1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.390144] env[59490]: 
DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-433ce9ee-92c9-4ac8-981c-7e2ec96ac865 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.391810] env[59490]: DEBUG oslo_vmware.api [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Task: {'id': task-707407, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065741} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 719.392069] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 719.392249] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 719.392407] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 719.392572] env[59490]: INFO nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Took 0.60 seconds to destroy the instance on the hypervisor. 
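[annotation] The DeleteDatastoreFile_Task sequence above (Invoking ... -> Waiting for the task -> progress is 0% -> completed successfully, with a recorded duration_secs) is the same wait/poll cycle every vCenter task in this log goes through, and the earlier CopyVirtualDisk_Task tracebacks show its failure path: _poll_task raises the fault translated from the task's error state. Below is a minimal sketch of that loop for orientation only; get_task_info is a hypothetical stand-in for the session call that reads the task's TaskInfo, not the real oslo.vmware internals.

import time

class TaskFault(Exception):
    """Stand-in for the translated fault (e.g. VimFaultException) raised on task error."""

def wait_for_task(get_task_info, task_ref, poll_interval=0.5):
    # Poll the vCenter Task until it reaches a terminal state, mirroring
    # what this log shows for task-707405 / task-707407 / task-707410.
    while True:
        info = get_task_info(task_ref)      # hypothetical TaskInfo reader
        if info['state'] == 'success':
            return info                     # carries e.g. duration_secs
        if info['state'] == 'error':
            # Failure path seen above: "A specified parameter was not
            # correct: fileType", Faults: ['InvalidArgument'].
            raise TaskFault(info['error'])
        # 'queued' / 'running': keep polling; the "progress is 0%" DEBUG
        # lines in the log correspond to these iterations.
        time.sleep(poll_interval)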
[ 719.395962] env[59490]: DEBUG nova.compute.claims [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 719.396251] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 719.396593] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 719.416544] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 719.468266] env[59490]: DEBUG oslo_vmware.rw_handles [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 719.521305] env[59490]: DEBUG oslo_vmware.rw_handles [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 719.521481] env[59490]: DEBUG oslo_vmware.rw_handles [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 719.721054] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-961e43b1-370c-4a80-8d07-ff4c6de44667 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.728693] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8460c85-2b35-46ed-bfad-1327f5e63b57 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.757383] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0de7061c-dd56-4e68-b6a6-3fe08f0d6f9f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.763977] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c49ecf4-71fb-45de-93ac-ff162957ff1d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 719.777202] env[59490]: DEBUG nova.compute.provider_tree [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 719.785368] env[59490]: DEBUG nova.scheduler.client.report [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 719.798349] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.402s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 719.798896] env[59490]: ERROR nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 719.798896] env[59490]: Faults: ['InvalidArgument'] [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Traceback (most recent call last): [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 719.798896] env[59490]: ERROR nova.compute.manager 
[instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self.driver.spawn(context, instance, image_meta, [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self._vmops.spawn(context, instance, image_meta, injected_files, [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self._fetch_image_if_missing(context, vi) [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] image_cache(vi, tmp_image_ds_loc) [ 719.798896] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] vm_util.copy_virtual_disk( [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] session._wait_for_task(vmdk_copy_task) [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return self.wait_for_task(task_ref) [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return evt.wait() [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] result = hub.switch() [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] return self.greenlet.switch() [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 719.799284] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] self.f(*self.args, **self.kw) [ 719.799646] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 719.799646] 
env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] raise exceptions.translate_fault(task_info.error) [ 719.799646] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 719.799646] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Faults: ['InvalidArgument'] [ 719.799646] env[59490]: ERROR nova.compute.manager [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] [ 719.799646] env[59490]: DEBUG nova.compute.utils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 719.801271] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Build of instance 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8 was re-scheduled: A specified parameter was not correct: fileType [ 719.801271] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 719.801635] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 719.801812] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 719.801978] env[59490]: DEBUG nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 719.802148] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 720.058158] env[59490]: DEBUG nova.network.neutron [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 720.074050] env[59490]: INFO nova.compute.manager [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] [instance: 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8] Took 0.27 seconds to deallocate network for instance. [ 720.153619] env[59490]: INFO nova.scheduler.client.report [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Deleted allocations for instance 6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8 [ 720.170969] env[59490]: DEBUG oslo_concurrency.lockutils [None req-bd0cd455-f391-4731-9e34-f006eb9a9ff3 tempest-ServerRescueNegativeTestJSON-2094913395 tempest-ServerRescueNegativeTestJSON-2094913395-project-member] Lock "6de1a2fb-d115-4fd6-9ae2-c6e2f841cbf8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 64.572s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.186265] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 720.234642] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 720.234880] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 720.236310] env[59490]: INFO nova.compute.claims [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 720.494358] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5aceaadd-71e8-454b-86e8-35402e842701 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.502818] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfe675c9-e7a8-413d-b991-3febdeb7541a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.533145] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9e0df15-d61c-454f-908f-1b574a577ff8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.540655] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abb90e58-5c14-428f-b7f2-94a9d0a324e6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.553506] env[59490]: DEBUG nova.compute.provider_tree [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 720.564395] env[59490]: DEBUG nova.scheduler.client.report [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 720.576811] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 
tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.342s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 720.577981] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 720.608486] env[59490]: DEBUG nova.compute.utils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 720.609852] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 720.610031] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 720.618311] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 720.683516] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 720.691662] env[59490]: DEBUG nova.policy [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '98211b3d4e484897b629a4e191177188', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98b593630f284e60b1d3e7e837a9c858', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 720.712613] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 720.713051] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 720.713051] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 720.713655] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 720.713655] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 720.713655] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 720.713943] env[59490]: DEBUG 
nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 720.713943] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 720.714065] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 720.714381] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 720.714599] env[59490]: DEBUG nova.virt.hardware [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 720.715470] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7897a1b-0467-4371-acee-f548888bf962 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.725351] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f71d734b-1d50-4f51-9640-9c9ebd995ac1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 720.983679] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Successfully created port: 889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 721.685551] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Successfully updated port: 889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 721.703120] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 721.703120] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 
tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquired lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 721.703120] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 721.763780] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 721.991871] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Updating instance_info_cache with network_info: [{"id": "889c28a1-e5e9-46f2-8d60-a4416d197765", "address": "fa:16:3e:26:7a:f6", "network": {"id": "d0b9487e-7988-4329-aa2c-2e14d698f07d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-9240153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98b593630f284e60b1d3e7e837a9c858", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15165046-2de9-4ada-9e99-0126e20854a9", "external-id": "nsx-vlan-transportzone-974", "segmentation_id": 974, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap889c28a1-e5", "ovs_interfaceid": "889c28a1-e5e9-46f2-8d60-a4416d197765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 722.006149] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Releasing lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 722.006149] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance network_info: |[{"id": "889c28a1-e5e9-46f2-8d60-a4416d197765", "address": "fa:16:3e:26:7a:f6", "network": {"id": "d0b9487e-7988-4329-aa2c-2e14d698f07d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-9240153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98b593630f284e60b1d3e7e837a9c858", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15165046-2de9-4ada-9e99-0126e20854a9", "external-id": "nsx-vlan-transportzone-974", "segmentation_id": 974, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap889c28a1-e5", "ovs_interfaceid": "889c28a1-e5e9-46f2-8d60-a4416d197765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 722.006270] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:26:7a:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '15165046-2de9-4ada-9e99-0126e20854a9', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '889c28a1-e5e9-46f2-8d60-a4416d197765', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 722.013840] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Creating folder: Project (98b593630f284e60b1d3e7e837a9c858). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.016026] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-045dcc90-10b5-4a0b-bfa7-6e0349986bff {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.027341] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Created folder: Project (98b593630f284e60b1d3e7e837a9c858) in parent group-v168905. [ 722.029018] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Creating folder: Instances. Parent ref: group-v168949. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 722.029018] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-48677bed-db61-47bb-aa53-f84ec2fbd7bd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.039016] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Created folder: Instances in parent group-v168949. 
[ 722.039016] env[59490]: DEBUG oslo.service.loopingcall [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 722.039016] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 722.039016] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f8a9259a-0a30-4465-b819-5c292602cbc1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 722.060017] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 722.060017] env[59490]: value = "task-707410" [ 722.060017] env[59490]: _type = "Task" [ 722.060017] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 722.067695] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707410, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 722.199855] env[59490]: DEBUG nova.compute.manager [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Received event network-vif-plugged-889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 722.199855] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Acquiring lock "581848be-38fb-42da-b723-480bf297d1a5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 722.199855] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Lock "581848be-38fb-42da-b723-480bf297d1a5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 722.199855] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Lock "581848be-38fb-42da-b723-480bf297d1a5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 722.200243] env[59490]: DEBUG nova.compute.manager [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] No waiting events found dispatching network-vif-plugged-889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 722.201026] env[59490]: WARNING nova.compute.manager [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Received unexpected event network-vif-plugged-889c28a1-e5e9-46f2-8d60-a4416d197765 for 
instance with vm_state building and task_state spawning. [ 722.201026] env[59490]: DEBUG nova.compute.manager [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Received event network-changed-889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 722.201026] env[59490]: DEBUG nova.compute.manager [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Refreshing instance network info cache due to event network-changed-889c28a1-e5e9-46f2-8d60-a4416d197765. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 722.201026] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Acquiring lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 722.201223] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Acquired lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 722.201433] env[59490]: DEBUG nova.network.neutron [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Refreshing network info cache for port 889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 722.567662] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707410, 'name': CreateVM_Task, 'duration_secs': 0.284217} completed successfully. 
[ 722.567846] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 722.568493] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 722.568646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 722.568957] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 722.569238] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b8b1996d-a6e5-4438-b3dc-c2ad3b2590ef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 722.573874] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Waiting for the task: (returnval){
[ 722.573874] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]521a401a-a7ea-f76b-52b1-2591829c61da"
[ 722.573874] env[59490]: _type = "Task"
[ 722.573874] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 722.581988] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]521a401a-a7ea-f76b-52b1-2591829c61da, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 722.834778] env[59490]: DEBUG nova.network.neutron [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Updated VIF entry in instance network info cache for port 889c28a1-e5e9-46f2-8d60-a4416d197765. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
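The three lockutils lines above show how the devstack image cache is serialized: the per-image cache path doubles as the lock name, so concurrent spawns of the same image wait on each other before the SearchDatastore_Task cache probe runs. A sketch of the pattern with oslo.concurrency (lockutils.lock is the real API; Nova additionally takes the external semaphore seen in the log, which this internal-only sketch omits):

```python
from oslo_concurrency import lockutils

image_id = "2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9"
cache_path = f"[datastore2] devstack-image-cache_base/{image_id}"

with lockutils.lock(cache_path):
    # Only one worker probes or populates this cached VMDK at a time;
    # the SearchDatastore_Task above is issued inside this critical section.
    pass
```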
[ 722.834778] env[59490]: DEBUG nova.network.neutron [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Updating instance_info_cache with network_info: [{"id": "889c28a1-e5e9-46f2-8d60-a4416d197765", "address": "fa:16:3e:26:7a:f6", "network": {"id": "d0b9487e-7988-4329-aa2c-2e14d698f07d", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-9240153-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "98b593630f284e60b1d3e7e837a9c858", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "15165046-2de9-4ada-9e99-0126e20854a9", "external-id": "nsx-vlan-transportzone-974", "segmentation_id": 974, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap889c28a1-e5", "ovs_interfaceid": "889c28a1-e5e9-46f2-8d60-a4416d197765", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 722.850731] env[59490]: DEBUG oslo_concurrency.lockutils [req-4aaf5e66-dae6-4e22-84c1-2ea0ef53057b req-7800e4a4-4f17-4e38-a282-c5cfeacd9d14 service nova] Releasing lock "refresh_cache-581848be-38fb-42da-b723-480bf297d1a5" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 723.084248] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 723.084248] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 723.084248] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 765.385064] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 765.385468] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}}
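The instance_info_cache payload above is the JSON-serializable network model Nova caches per instance. A small standalone sketch (not Nova code) pulling the commonly used fields out of that structure, with the dict literal abbreviated to the keys it reads:

```python
vif = {
    "id": "889c28a1-e5e9-46f2-8d60-a4416d197765",
    "address": "fa:16:3e:26:7a:f6",
    "devname": "tap889c28a1-e5",
    "network": {"subnets": [{"cidr": "192.168.128.0/28",
                             "ips": [{"address": "192.168.128.4"}]}]},
}

fixed_ips = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"]]
# prints: tap889c28a1-e5 fa:16:3e:26:7a:f6 ['192.168.128.4']
print(vif["devname"], vif["address"], fixed_ips)
```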
[ 766.383971] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 766.394013] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 766.394332] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 766.394372] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 766.394567] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 766.395644] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5023e8be-80dd-4a4d-a024-f67f867738b0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.404746] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d4e379-ae6e-4890-b65e-befda7e858f3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.418447] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-baafcc67-e3c1-4609-9017-d11a3004da60 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.424570] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2087a800-40d2-4f2c-ac19-753459d42499 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.454433] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181609MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 766.454572] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 766.454745] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
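The "Running periodic task ComputeManager.*" lines come from oslo.service's periodic task machinery: methods decorated with @periodic_task.periodic_task are collected by a PeriodicTasks base class and dispatched by run_periodic_tasks. A hedged sketch of that declaration style (the spacing value and method body are illustrative, not Nova's):

```python
from oslo_config import cfg
from oslo_service import periodic_task

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(cfg.CONF)

    @periodic_task.periodic_task(spacing=60)
    def update_available_resource(self, context):
        # audit hypervisor resources, then reconcile with placement
        pass

mgr = Manager()
mgr.run_periodic_tasks(context=None)  # normally driven by the service loop
```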
[ 766.526684] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 08923fae-e356-444d-b221-b40576b54af9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.526862] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 398edc73-9487-4365-9e55-6eaa1f530f64 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.526949] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 71698ce4-94a0-442c-8081-374616ce2ac4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527076] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527190] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527301] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527410] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527514] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 31207de9-e903-4ed4-bccc-c0796edec34b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527619] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2f083456-3eb9-4022-86a3-8d39f83c470f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.527725] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 581848be-38fb-42da-b723-480bf297d1a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 766.542903] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.553012] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3464c5af-60a4-4b6d-b7ca-51cf7312cf09 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.562465] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 504e16b8-70d2-437f-ab3e-7631cb74abec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.572154] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 12082268-a4a2-4eb8-9adc-93c7e7d82c42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.581082] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance de66fe1b-8f03-4a10-af9d-302cb5021f79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.590309] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.599437] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance c1a44e57-5a85-475c-898e-9f30e0c6b492 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.608191] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2cbf0a49-2835-41c2-8840-9515f1e95d5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.617070] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d5b78319-88f9-4771-8b14-e833d05eb3d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
[ 766.627030] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance cfe59672-be1d-43ed-b8d4-b5ed51e08a34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}}
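The audit above walks every allocation placement holds against this node and sorts it into two buckets: instances the host actively manages keep their allocations, while instances merely scheduled here (guest not started yet) are skipped rather than healed. A condensed sketch of that decision; the names are illustrative, not Nova's _remove_deleted_instances_allocations signature:

```python
def audit_allocation(uuid, managed_here, scheduled_here, alloc):
    """Classify one placement allocation against this compute node."""
    if managed_here:
        return f"Instance {uuid} actively managed, keeping {alloc}"
    if scheduled_here:
        return f"Instance {uuid} scheduled but not started, skipping heal of {alloc}"
    return f"Instance {uuid} unknown here, candidate for allocation cleanup"
```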
[ 766.627144] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 766.627219] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 766.856119] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd7cfb52-b985-4ca3-aed5-79b44c082745 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.863856] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b80c80-f8e0-47d0-ba4a-d7ff8d1d70af {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.894011] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48f05656-5cde-4298-b99c-ea48c1f068aa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.901572] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2d179ea-abfb-461a-83c3-d5c4d3ed55f9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 766.914160] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 766.922287] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 766.935309] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 766.935486] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.481s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 767.930548] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
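The "Final resource view" above is internally consistent with the per-instance allocations logged earlier: ten instances, each holding {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, plus the 512 MB reserved in the MEMORY_MB inventory. A quick cross-check of the arithmetic:

```python
instances = 10
used_ram = 512 + instances * 128   # 1792 -> matches used_ram=1792MB
used_disk = instances * 1          # 10   -> matches used_disk=10GB
used_vcpus = instances * 1         # 10   -> matches used_vcpus=10
vcpu_capacity = 48 * 4.0           # 192 schedulable vCPUs at allocation_ratio 4.0
print(used_ram, used_disk, used_vcpus, vcpu_capacity)
```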
[ 767.954216] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 767.954216] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 767.954216] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 768.383873] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 768.384248] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 769.384047] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 769.384408] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 769.384408] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}}
[ 769.397703] env[59490]: WARNING oslo_vmware.rw_handles [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles response.begin()
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 769.397703] env[59490]: ERROR oslo_vmware.rw_handles
[ 769.397703] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 769.399440] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 769.399677] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Copying Virtual Disk [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/1d6fbda2-19a9-46d4-9579-5f21d52279fb/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 769.399932] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-069ccd57-8952-4958-be3f-ae0dd5ec9003 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 769.405052] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 08923fae-e356-444d-b221-b40576b54af9] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405187] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405313] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405438] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405558] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
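The RemoteDisconnected traceback above is raised while closing a read handle: http.client raises it when the far end hangs up before sending a response, and rw_handles logs it as a warning rather than failing, which is why the "Downloaded image file data" line still follows. A minimal reproduction of that close-time handling (a sketch, not the oslo.vmware implementation):

```python
import http.client

def close_handle(conn: http.client.HTTPConnection):
    try:
        conn.getresponse()  # the server may already have closed the socket
    except http.client.RemoteDisconnected as exc:
        # tolerated and logged, mirroring the WARNING above
        print(f"Error occurred while reading the HTTP response.: {exc}")
    finally:
        conn.close()
```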
[ 769.405674] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405791] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.405906] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.406030] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.406148] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 769.406261] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}}
[ 769.407108] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 769.411709] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){
[ 769.411709] env[59490]: value = "task-707411"
[ 769.411709] env[59490]: _type = "Task"
[ 769.411709] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 769.420201] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707411, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 769.922177] env[59490]: DEBUG oslo_vmware.exceptions [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 769.922583] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 769.923155] env[59490]: ERROR nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 769.923155] env[59490]: Faults: ['InvalidArgument']
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] Traceback (most recent call last):
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] yield resources
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self.driver.spawn(context, instance, image_meta,
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self._fetch_image_if_missing(context, vi)
[ 769.923155] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] image_cache(vi, tmp_image_ds_loc)
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] vm_util.copy_virtual_disk(
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] session._wait_for_task(vmdk_copy_task)
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return self.wait_for_task(task_ref)
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return evt.wait()
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] result = hub.switch()
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 769.923522] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return self.greenlet.switch()
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self.f(*self.args, **self.kw)
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] raise exceptions.translate_fault(task_info.error)
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] Faults: ['InvalidArgument']
[ 769.923886] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9]
[ 769.923886] env[59490]: INFO nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Terminating instance
[ 769.925024] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 769.925223] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 769.925441] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7d2ae792-2eac-4377-b4b8-f6d31371015b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 769.928029] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
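The "Fault InvalidArgument not matched." DEBUG line and the raise exceptions.translate_fault(task_info.error) frame in the traceback describe oslo.vmware's fault mapping: known VMware fault names map to dedicated exception classes, and anything unmatched falls back to the generic VimFaultException seen here. A simplified sketch of that dispatch; the registry and constructor signatures are illustrative, not the library's:

```python
class VimFaultException(Exception):
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

_FAULT_CLASSES = {}  # e.g. {'FileNotFound': FileNotFoundError, ...}

def translate_fault(fault_name, message):
    cls = _FAULT_CLASSES.get(fault_name)
    if cls is None:
        # corresponds to "Fault InvalidArgument not matched." above
        return VimFaultException([fault_name], message)
    return cls(message)

err = translate_fault('InvalidArgument',
                      'A specified parameter was not correct: fileType')
```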
[ 769.928193] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 769.928950] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bcee0a73-1c4a-48ed-95f2-e4664bdffbd5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 769.932846] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 769.933014] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 769.935423] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-298beb34-910b-4d1b-8676-961028f9f0ab {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 769.937564] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 769.937765] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-af083a70-9a55-4a0e-bd84-90756c99a754 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 769.941682] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){
[ 769.941682] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52cde4e2-672a-7fe8-c052-00ed3632b79c"
[ 769.941682] env[59490]: _type = "Task"
[ 769.941682] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 769.955049] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52cde4e2-672a-7fe8-c052-00ed3632b79c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 770.002703] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 770.002703] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 770.002703] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Deleting the datastore file [datastore2] 08923fae-e356-444d-b221-b40576b54af9 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 770.002703] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-06e30c9e-ad45-4e58-ba27-0373370f64a9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.008890] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){
[ 770.008890] env[59490]: value = "task-707413"
[ 770.008890] env[59490]: _type = "Task"
[ 770.008890] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 770.016043] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707413, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
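The teardown above follows a fixed order: UnregisterVM completes synchronously (no task is polled before "Unregistered the VM"), while the datastore cleanup is a real task (task-707413) that must be waited on. An illustrative encoding of just that ordering, with the steps passed in as callables rather than Nova's actual helpers:

```python
def destroy_vm(unregister_vm, delete_datastore_file, wait_for_task):
    """Sketch of the destroy ordering in the log; not Nova's real code."""
    unregister_vm()                  # synchronous in the log above
    task = delete_datastore_file()   # returns a task handle, e.g. task-707413
    wait_for_task(task)              # polled until DeleteDatastoreFile_Task completes
```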
[ 770.453069] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 770.453351] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 770.453556] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d9af6b28-5013-4a20-b39b-c34d953bb96c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.466932] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 770.467134] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Fetch image to [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 770.467299] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 770.468046] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00fce624-3f9f-4d67-9bdf-4f97cd44c8f3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.474896] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8de99e25-333d-4eb1-91bc-f37c02fbf218 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.483562] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98a21183-9f55-4956-8ae9-523e2dee6dbf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.516374] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7917370-f490-472c-b709-62518482b492 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.524592] env[59490]: DEBUG oslo_vmware.api [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707413, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078875} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 770.525112] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 770.525313] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 770.525484] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 770.525647] env[59490]: INFO nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 770.527202] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a9b4a739-506f-4516-b137-4be144749c58 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.529207] env[59490]: DEBUG nova.compute.claims [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 770.529516] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 770.529596] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 770.625049] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 770.669416] env[59490]: DEBUG oslo_vmware.rw_handles [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 770.723745] env[59490]: DEBUG oslo_vmware.rw_handles [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 770.723911] env[59490]: DEBUG oslo_vmware.rw_handles [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 770.838850] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f1b3de4-17c2-48fe-86a4-19f2c98837ad {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.846357] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-115b2158-2a97-49ee-9de8-24a2cde4bcda {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.876321] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f0eb41-6356-4eaa-be46-47e72f0af422 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.882914] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd955d06-04f3-4ab9-b667-132ebaa486d9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 770.895093] env[59490]: DEBUG nova.compute.provider_tree [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 770.903033] env[59490]: DEBUG nova.scheduler.client.report [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
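The rw_handles lines above stream the image bytes over a plain HTTP PUT to the datastore's /folder URL and then close the write handle (the point where the earlier RemoteDisconnected can surface). A rough sketch of the write side with http.client; the path layout mirrors the log, but ticket and TLS handling are omitted assumptions:

```python
import http.client

def upload_vmdk(host: str, path: str, data: bytes) -> http.client.HTTPSConnection:
    conn = http.client.HTTPSConnection(host, 443)
    # e.g. path = "/folder/vmware_temp/.../tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2"
    conn.request('PUT', path, body=data)  # size = 21318656 in the log above
    return conn  # the caller reads (or tolerates losing) the response on close
```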
{{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 770.915986] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.386s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 770.916580] env[59490]: ERROR nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 770.916580] env[59490]: Faults: ['InvalidArgument'] [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] Traceback (most recent call last): [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self.driver.spawn(context, instance, image_meta, [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self._fetch_image_if_missing(context, vi) [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] image_cache(vi, tmp_image_ds_loc) [ 770.916580] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] vm_util.copy_virtual_disk( [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] session._wait_for_task(vmdk_copy_task) [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return self.wait_for_task(task_ref) [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 
770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return evt.wait() [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] result = hub.switch() [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] return self.greenlet.switch() [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 770.916952] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] self.f(*self.args, **self.kw) [ 770.917348] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 770.917348] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] raise exceptions.translate_fault(task_info.error) [ 770.917348] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 770.917348] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] Faults: ['InvalidArgument'] [ 770.917348] env[59490]: ERROR nova.compute.manager [instance: 08923fae-e356-444d-b221-b40576b54af9] [ 770.917348] env[59490]: DEBUG nova.compute.utils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 770.919881] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Build of instance 08923fae-e356-444d-b221-b40576b54af9 was re-scheduled: A specified parameter was not correct: fileType [ 770.919881] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 770.920253] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 770.920414] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
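The entries above capture Nova's reschedule path: the VimFaultException raised by the driver's spawn() bubbles up through _build_and_run_instance, a usage notification is emitted, VIFs and networks are torn down, and the build is handed back to the scheduler rather than erroring the instance outright. A minimal sketch of that control flow follows; spawn, cleanup_networks and reschedule are hypothetical stand-ins for the real Nova methods, and only the shape of the flow is taken from the log:

    # Hypothetical sketch of the reschedule path logged above; this is an
    # illustration of the control flow, not Nova's actual code.
    def build_and_run(instance, spawn, cleanup_networks, reschedule):
        try:
            spawn(instance)
        except Exception as exc:
            # e.g. VimFaultException: "A specified parameter was not
            # correct: fileType" -> the build is re-scheduled, not failed.
            cleanup_networks(instance)   # "Deallocating network for instance"
            reschedule(instance, reason=str(exc))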
[ 770.920571] env[59490]: DEBUG nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 770.920723] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 771.186978] env[59490]: DEBUG nova.network.neutron [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 771.204515] env[59490]: INFO nova.compute.manager [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 08923fae-e356-444d-b221-b40576b54af9] Took 0.28 seconds to deallocate network for instance. [ 771.297767] env[59490]: INFO nova.scheduler.client.report [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Deleted allocations for instance 08923fae-e356-444d-b221-b40576b54af9 [ 771.319320] env[59490]: DEBUG oslo_concurrency.lockutils [None req-32fca1e6-f910-4af9-a7d8-c8f2c2454b7c tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "08923fae-e356-444d-b221-b40576b54af9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 113.857s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 771.339477] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 771.391579] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 771.391818] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 771.393261] env[59490]: INFO nova.compute.claims [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 771.659147] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1523ce0-7a98-43ef-8467-d4e0bd735a73 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 771.666870] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-429197df-f838-41b2-bc2b-27489ac889fa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 771.697144] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d3cc48f-9821-4c07-bb95-b482eca94420 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 771.704571] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a120c12-fc66-40b5-8343-2ea8a1c8b6f4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 771.717579] env[59490]: DEBUG nova.compute.provider_tree [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 771.725653] env[59490]: DEBUG nova.scheduler.client.report [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 771.738299] env[59490]: DEBUG 
oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.346s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 771.738718] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 771.768202] env[59490]: DEBUG nova.compute.utils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 771.769419] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 771.769588] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 771.777353] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 771.822784] env[59490]: DEBUG nova.policy [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8a8165d64e0e48b19916b0a8af4fc762', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '473657318cbd48ec93632775a3ee950a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 771.838399] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 771.861258] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 771.861258] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 771.861258] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 771.861399] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 771.861399] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 771.861399] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 771.861399] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 771.861399] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Build topologies for 1 vcpu(s) 1:1:1
{{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 771.861532] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 771.861739] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 771.862030] env[59490]: DEBUG nova.virt.hardware [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 771.862933] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1faad62b-3675-4932-828c-4ee54272f805 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 771.871136] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2155b062-5c3d-4e15-b2da-edffe318dea2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 772.202024] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Successfully created port: b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 773.112581] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Successfully updated port: b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 773.120962] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 773.121119] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquired lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 773.121259] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Building 
network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 773.172689] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 773.401200] env[59490]: DEBUG nova.compute.manager [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Received event network-vif-plugged-b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 773.401413] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Acquiring lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 773.401601] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 773.401747] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 773.401885] env[59490]: DEBUG nova.compute.manager [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] No waiting events found dispatching network-vif-plugged-b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 773.402284] env[59490]: WARNING nova.compute.manager [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Received unexpected event network-vif-plugged-b662ce79-2d96-4f63-9fda-6795b5a9e8ca for instance with vm_state building and task_state spawning. [ 773.402522] env[59490]: DEBUG nova.compute.manager [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Received event network-changed-b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 773.402679] env[59490]: DEBUG nova.compute.manager [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Refreshing instance network info cache due to event network-changed-b662ce79-2d96-4f63-9fda-6795b5a9e8ca.
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 773.402840] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Acquiring lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 773.407076] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Updating instance_info_cache with network_info: [{"id": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "address": "fa:16:3e:e0:6e:b8", "network": {"id": "398c84c9-303b-4d16-b2e3-5ef256fed433", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1267295314-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "473657318cbd48ec93632775a3ee950a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", "segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb662ce79-2d", "ovs_interfaceid": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.418040] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Releasing lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 773.418040] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance network_info: |[{"id": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "address": "fa:16:3e:e0:6e:b8", "network": {"id": "398c84c9-303b-4d16-b2e3-5ef256fed433", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1267295314-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "473657318cbd48ec93632775a3ee950a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", 
"segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb662ce79-2d", "ovs_interfaceid": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 773.418352] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Acquired lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 773.418352] env[59490]: DEBUG nova.network.neutron [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Refreshing network info cache for port b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 773.419104] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e0:6e:b8', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7654928b-7afe-42e3-a18d-68ecc775cefe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b662ce79-2d96-4f63-9fda-6795b5a9e8ca', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 773.427050] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Creating folder: Project (473657318cbd48ec93632775a3ee950a). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 773.427692] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-31a8bc7a-83c9-4479-8fd8-2862b0f02622 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 773.441380] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Created folder: Project (473657318cbd48ec93632775a3ee950a) in parent group-v168905. [ 773.441539] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Creating folder: Instances. Parent ref: group-v168952. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 773.441758] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8dcf8373-b308-4086-bc4b-1f2267c82a35 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 773.450347] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Created folder: Instances in parent group-v168952. [ 773.450590] env[59490]: DEBUG oslo.service.loopingcall [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 773.450870] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 773.450960] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a77e69b5-4b5b-49f6-a7d4-f99c730f9886 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 773.476122] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 773.476122] env[59490]: value = "task-707416" [ 773.476122] env[59490]: _type = "Task" [ 773.476122] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 773.481047] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707416, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 773.801160] env[59490]: DEBUG nova.network.neutron [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Updated VIF entry in instance network info cache for port b662ce79-2d96-4f63-9fda-6795b5a9e8ca. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 773.801516] env[59490]: DEBUG nova.network.neutron [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Updating instance_info_cache with network_info: [{"id": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "address": "fa:16:3e:e0:6e:b8", "network": {"id": "398c84c9-303b-4d16-b2e3-5ef256fed433", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1267295314-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "473657318cbd48ec93632775a3ee950a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7654928b-7afe-42e3-a18d-68ecc775cefe", "external-id": "cl2-zone-807", "segmentation_id": 807, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb662ce79-2d", "ovs_interfaceid": "b662ce79-2d96-4f63-9fda-6795b5a9e8ca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 773.816116] env[59490]: DEBUG oslo_concurrency.lockutils [req-c07cfd4f-8ea5-4d4d-a632-de9082c311e2 req-13add2a0-cc7e-433b-8412-ad99dafdab1e service nova] Releasing lock "refresh_cache-1c7b3da9-32ab-4aa0-90e3-f27bf5996590" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 773.983513] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707416, 'name': CreateVM_Task, 'duration_secs': 0.292742} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 773.983513] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 773.985023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 773.985023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 773.985023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 773.985023] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-da9b4e67-407b-4195-b83f-6593e5ff9343 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 773.989289] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Waiting for the task: (returnval){ [ 773.989289] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]526d8309-547a-eadf-7bef-cd4ac8acd556" [ 773.989289] env[59490]: _type = "Task" [ 773.989289] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 773.997254] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]526d8309-547a-eadf-7bef-cd4ac8acd556, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 774.500053] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 774.500500] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 774.500500] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 775.388598] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "0ead3d36-7d65-4e6d-be85-a6736acd3802" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 775.388863] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "0ead3d36-7d65-4e6d-be85-a6736acd3802" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 820.327947] env[59490]: WARNING oslo_vmware.rw_handles [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed
connection without" [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 820.327947] env[59490]: ERROR oslo_vmware.rw_handles [ 820.328565] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 820.330642] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 820.330917] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Copying Virtual Disk [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/ce994440-fd2c-498b-8f4a-f4be0f0899f6/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 820.331257] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9d3ef711-4b3e-4d0b-a74c-150ba8b5b7c8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 820.339461] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 820.339461] env[59490]: value = "task-707417" [ 820.339461] env[59490]: _type = "Task" [ 820.339461] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 820.347663] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707417, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 820.850978] env[59490]: DEBUG oslo_vmware.exceptions [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Fault InvalidArgument not matched. 
{{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 820.851259] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 820.851785] env[59490]: ERROR nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 820.851785] env[59490]: Faults: ['InvalidArgument'] [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Traceback (most recent call last): [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] yield resources [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self.driver.spawn(context, instance, image_meta, [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self._vmops.spawn(context, instance, image_meta, injected_files, [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self._fetch_image_if_missing(context, vi) [ 820.851785] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] image_cache(vi, tmp_image_ds_loc) [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] vm_util.copy_virtual_disk( [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] session._wait_for_task(vmdk_copy_task) [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 
820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return self.wait_for_task(task_ref) [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return evt.wait() [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] result = hub.switch() [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 820.852213] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return self.greenlet.switch() [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self.f(*self.args, **self.kw) [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] raise exceptions.translate_fault(task_info.error) [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Faults: ['InvalidArgument'] [ 820.852613] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] [ 820.852613] env[59490]: INFO nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Terminating instance [ 820.853600] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 820.853801] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 820.854035] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4f81d02e-07e4-4e20-8860-31a76c86462b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 820.856212] env[59490]: DEBUG nova.compute.manager [None 
req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 820.856469] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 820.857287] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0535c76-2c20-4749-a8fd-346f6ef35a2a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 820.864208] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 820.865189] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7737fbe9-7c26-4c52-a415-31f5d0f02cfd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 820.866636] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 820.866791] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 820.867464] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b37a331a-3ad9-40ef-9ec8-e952b6b4e1a6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 820.872209] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 820.872209] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]528ee67f-5a95-b961-f0a9-e4eb9b34f482" [ 820.872209] env[59490]: _type = "Task" [ 820.872209] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 820.879455] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]528ee67f-5a95-b961-f0a9-e4eb9b34f482, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 820.943516] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 820.943711] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 820.943877] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleting the datastore file [datastore2] 398edc73-9487-4365-9e55-6eaa1f530f64 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 820.944150] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-02981c28-f8f0-4cf4-8a78-b5206dc8385a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 820.950801] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){
[ 820.950801] env[59490]: value = "task-707419"
[ 820.950801] env[59490]: _type = "Task"
[ 820.950801] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 820.958430] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707419, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 821.382386] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 821.382655] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating directory with path [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 821.382870] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cc384a9f-ac90-43cd-9fee-3b66d8eba3cb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.393947] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created directory with path [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 821.394210] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Fetch image to [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 821.394334] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 821.395083] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4fa433f-ddb5-4e62-b11a-49d18620da4f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.401659] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1bfd845-4afc-4cad-91eb-5fbc500e1e89 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.410602] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11c896c2-6ef5-4519-9fdb-c3176aa41611 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.440790] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36974672-d790-448e-bc86-2d43e2711a85 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.446459] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c56c34dc-35bc-4e3d-94dc-d2cf591b82fc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.458583] env[59490]: DEBUG oslo_vmware.api [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707419, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061596} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 821.458695] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 821.458786] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 821.458947] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 821.459125] env[59490]: INFO nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Took 0.60 seconds to destroy the instance on the hypervisor.
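
The destroy sequence above is the standard oslo.vmware task round-trip: Nova asks the FileManager for a DeleteDatastoreFile_Task, then wait_for_task polls it (the repeated "progress is 0%" lines) until it finishes. A minimal standalone sketch of that pattern follows; the host, credentials, and omitted datacenter reference are placeholders, not values from this deployment:

```python
# Minimal sketch of the DeleteDatastoreFile_Task / wait_for_task round-trip
# logged above. vc.example.test and the credentials are placeholders; error
# handling and TLS configuration are elided.
from oslo_vmware import api

session = api.VMwareAPISession(
    'vc.example.test',               # placeholder vCenter host
    'administrator@vsphere.local',   # placeholder user
    'secret',                        # placeholder password
    api_retry_count=10,
    task_poll_interval=0.5)          # drives the "progress is 0%" polling

file_manager = session.vim.service_content.fileManager
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task', file_manager,
    name='[datastore2] 398edc73-9487-4365-9e55-6eaa1f530f64',
    datacenter=None)                 # datacenter moref omitted for brevity
session.wait_for_task(task)          # blocks until success or raises on error
```
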
[ 821.461160] env[59490]: DEBUG nova.compute.claims [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 821.461328] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 821.461533] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 821.468398] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 821.518701] env[59490]: DEBUG oslo_vmware.rw_handles [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 821.578503] env[59490]: DEBUG oslo_vmware.rw_handles [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 821.579077] env[59490]: DEBUG oslo_vmware.rw_handles [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 821.772373] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcfbe020-8393-41b1-9109-37e6731d16fd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.779744] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-defb4256-02c0-4226-a2f7-94c76b5230a4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.808476] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-250c31c6-16ea-453f-8798-bd38842bd332 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.815016] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f249eafa-0506-486b-8b2f-28056c41ad7e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 821.827518] env[59490]: DEBUG nova.compute.provider_tree [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 821.835563] env[59490]: DEBUG nova.scheduler.client.report [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 821.848869] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.387s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 821.849406] env[59490]: ERROR nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 821.849406] env[59490]: Faults: ['InvalidArgument']
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Traceback (most recent call last):
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self.driver.spawn(context, instance, image_meta,
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self._fetch_image_if_missing(context, vi)
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] image_cache(vi, tmp_image_ds_loc)
[ 821.849406] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] vm_util.copy_virtual_disk(
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] session._wait_for_task(vmdk_copy_task)
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return self.wait_for_task(task_ref)
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return evt.wait()
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] result = hub.switch()
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] return self.greenlet.switch()
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 821.850027] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] self.f(*self.args, **self.kw)
[ 821.850500] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 821.850500] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] raise exceptions.translate_fault(task_info.error)
[ 821.850500] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 821.850500] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Faults: ['InvalidArgument']
[ 821.850500] env[59490]: ERROR nova.compute.manager [instance: 398edc73-9487-4365-9e55-6eaa1f530f64]
[ 821.850500] env[59490]: DEBUG nova.compute.utils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 821.851486] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Build of instance 398edc73-9487-4365-9e55-6eaa1f530f64 was re-scheduled: A specified parameter was not correct: fileType
[ 821.851486] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}}
[ 821.851842] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 821.852014] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 821.852184] env[59490]: DEBUG nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 821.852362] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 822.144773] env[59490]: DEBUG nova.network.neutron [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 822.158580] env[59490]: INFO nova.compute.manager [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 398edc73-9487-4365-9e55-6eaa1f530f64] Took 0.31 seconds to deallocate network for instance.
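
The traceback above shows oslo.vmware's _poll_task translating the failed CopyVirtualDisk task into a VimFaultException, which bubbles out of wait_for_task and triggers the re-schedule. A sketch of catching and classifying that exception at the call site, under the same placeholder assumptions as the earlier snippet:

```python
# Sketch: classifying the failure mode seen in the traceback above.
# e.fault_list carries the raw VIM fault names (here ['InvalidArgument']).
from oslo_vmware import exceptions as vexc

def wait_and_classify(session, task):
    """Wait for a VIM task; return True on success, False on a VIM fault."""
    try:
        session.wait_for_task(task)
        return True
    except vexc.VimFaultException as e:
        # str(e) is the vCenter message, e.g.
        # "A specified parameter was not correct: fileType"
        print('task failed, faults=%s: %s' % (e.fault_list, e))
        return False
```
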
[ 822.248551] env[59490]: INFO nova.scheduler.client.report [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleted allocations for instance 398edc73-9487-4365-9e55-6eaa1f530f64
[ 822.267026] env[59490]: DEBUG oslo_concurrency.lockutils [None req-92f95e99-195d-4ed4-b4ec-060c027a4080 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "398edc73-9487-4365-9e55-6eaa1f530f64" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 157.320s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 822.288152] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 822.346901] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 822.347595] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 822.348967] env[59490]: INFO nova.compute.claims [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 822.612440] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55029952-573d-4ce2-b2ef-f94ae0a6ff5a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.619760] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-40ba9653-c18c-4fc7-83c6-7e0e51d2aa53 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.648916] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb66532e-f04f-4714-99ef-3ffd488f4180 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.655862] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fe5e366-9a44-4afb-ab01-0ccef242f1cd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.668710] env[59490]: DEBUG nova.compute.provider_tree [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 822.676620] env[59490]: DEBUG nova.scheduler.client.report [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 822.692083] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.345s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 822.692495] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 822.724720] env[59490]: DEBUG nova.compute.utils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 822.726506] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Not allocating networking since 'none' was specified. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1948}}
[ 822.734810] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 822.795988] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 822.817354] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 822.817588] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 822.817739] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 822.817913] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 822.818062] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 822.818204] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 822.818401] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 822.818553] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 822.818709] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 822.818861] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 822.819032] env[59490]: DEBUG nova.virt.hardware [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 822.819917] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e753350-a4e9-4b48-af5c-3287ef6a6f3f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.827786] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9871d03-1808-40cb-b765-1cad5f438b6f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.841152] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance VIF info [] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 822.846736] env[59490]: DEBUG oslo.service.loopingcall [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 822.846965] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 822.847179] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-894032b8-c261-4ea8-96b4-308a52894d33 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 822.864522] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 822.864522] env[59490]: value = "task-707420"
[ 822.864522] env[59490]: _type = "Task"
[ 822.864522] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 822.871874] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707420, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 823.374461] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707420, 'name': CreateVM_Task, 'duration_secs': 0.237248} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
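
The claim bookkeeping above (Acquiring lock / acquired / released, with wait and hold times) is oslo.concurrency's lockutils serializing access to the resource tracker under the process-local "compute_resources" lock. A minimal sketch of that locking pattern; the decorated body is illustrative, not Nova's actual claim logic:

```python
# Sketch of the lockutils pattern behind the "compute_resources" lock lines
# above; the claims dict is a stand-in for the real resource tracker state.
from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources')
def instance_claim(tracker, instance_uuid):
    # Runs with the semantics logged above: concurrent callers queue on the
    # named lock, and wait/hold durations are logged on entry and exit.
    tracker.claims[instance_uuid] = {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}
```
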
[ 823.374626] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 823.375046] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 823.375201] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 823.375506] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 823.375722] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-87c5aa97-3b8b-40d9-b48a-a89336564e58 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 823.380043] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){
[ 823.380043] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5216e329-4c8d-9a87-0524-f323f95d287f"
[ 823.380043] env[59490]: _type = "Task"
[ 823.380043] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 823.387110] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5216e329-4c8d-9a87-0524-f323f95d287f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 823.890585] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 823.890585] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 823.890585] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 824.124137] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 824.124377] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 825.384421] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 825.384788] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}}
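
The "Running periodic task ComputeManager._..." lines here and below come from oslo.service's periodic task machinery: the manager subclasses PeriodicTasks, and each decorated method runs on its own interval. A compressed sketch, with a locally registered reclaim_instance_interval option standing in for Nova's real configuration:

```python
# Sketch of the oslo.service loop producing the "Running periodic task"
# entries in the log. The option is registered locally for the example;
# Nova defines the real one elsewhere.
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF
CONF.register_opts([cfg.IntOpt('reclaim_instance_interval', default=0)])

class Manager(periodic_task.PeriodicTasks):
    def __init__(self):
        super().__init__(CONF)

    @periodic_task.periodic_task(spacing=60)
    def _reclaim_queued_deletes(self, context):
        if CONF.reclaim_instance_interval <= 0:
            return  # matches the "skipping..." line in the log

Manager().run_periodic_tasks(context=None)  # normally driven by a timer loop
```
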
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 825.384843] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 825.384975] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11099}} [ 825.396596] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] There are 0 instances to clean {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 825.396789] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 825.397120] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances with incomplete migration {{(pid=59490) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11137}} [ 825.408375] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 826.413749] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 826.423480] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 826.423700] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.423857] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 826.424014] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 826.425133] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e51f957-f241-4d03-b545-79edd6404f41 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
826.434452] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cb97bc0-61f3-4be6-b6e4-8da3e34e579b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.450565] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cc88da3-e2a4-4e62-8d96-905c1a288687 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.456820] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc687cbd-1ae3-42da-9a3b-55c412da1444 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.485205] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181639MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 826.485343] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 826.485519] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 826.549683] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 71698ce4-94a0-442c-8081-374616ce2ac4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.549863] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.549967] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550087] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550202] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550314] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 31207de9-e903-4ed4-bccc-c0796edec34b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550465] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2f083456-3eb9-4022-86a3-8d39f83c470f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550615] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 581848be-38fb-42da-b723-480bf297d1a5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550726] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.550833] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 3464c5af-60a4-4b6d-b7ca-51cf7312cf09 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 826.561517] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 504e16b8-70d2-437f-ab3e-7631cb74abec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.571820] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 12082268-a4a2-4eb8-9adc-93c7e7d82c42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.581233] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance de66fe1b-8f03-4a10-af9d-302cb5021f79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.590395] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.599474] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance c1a44e57-5a85-475c-898e-9f30e0c6b492 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.609662] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2cbf0a49-2835-41c2-8840-9515f1e95d5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.619187] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d5b78319-88f9-4771-8b14-e833d05eb3d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.629088] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance cfe59672-be1d-43ed-b8d4-b5ed51e08a34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.638400] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ead3d36-7d65-4e6d-be85-a6736acd3802 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.647604] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 826.647839] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 826.647985] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 826.899123] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-653247b8-e2be-4bf6-ad81-4e965b8a3b83 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.906179] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-346c8723-3d95-4953-a45f-4800acb5f854 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.943492] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ae2a336-8a0c-447f-a1dc-c603cc024e94 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.952662] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a67c19e4-5f29-450b-9c00-434a2786e190 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 826.966361] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 826.974770] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 826.988094] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 826.988215] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.503s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 827.958640] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 829.379536] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 829.383163] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 829.383340] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 829.383458] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 829.402159] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402329] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402462] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402584] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402705] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402823] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Skipping network cache update for instance because it is Building. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.402938] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.403064] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.403180] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.403293] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 829.403407] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 829.403804] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 829.403962] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 829.404116] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 830.384625] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 869.435481] env[59490]: WARNING oslo_vmware.rw_handles [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 869.435481] env[59490]: ERROR 
oslo_vmware.rw_handles response.begin() [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 869.435481] env[59490]: ERROR oslo_vmware.rw_handles [ 869.435481] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 869.437398] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 869.437647] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Copying Virtual Disk [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/8d48d5da-a645-4076-ba17-079199121caa/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 869.437928] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-367ffefb-4e3d-4336-a3a9-d25973620f95 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.446235] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 869.446235] env[59490]: value = "task-707421" [ 869.446235] env[59490]: _type = "Task" [ 869.446235] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 869.456527] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': task-707421, 'name': CopyVirtualDisk_Task} progress is 0%. 
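
The CopyVirtualDisk_Task invocation and the "Task: {'id': task-707421 ...} progress is 0%" polling above come from oslo.vmware's session layer. A rough sketch of that invoke-then-wait pattern; the endpoint, credentials and datastore paths below are placeholders, not values from this deployment:

from oslo_vmware import api

# Placeholder endpoint and credentials; the poll interval drives the
# recurring "progress is N%" DEBUG lines.
session = api.VMwareAPISession(
    'vc.example.test', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore2] some_dir/tmp-sparse.vmdk',  # hypothetical path
    destName='[datastore2] some_dir/image.vmdk')         # hypothetical path
# wait_for_task() polls the task object and raises VimFaultException if
# the task errors out -- exactly the failure mode seen later in this log.
session.wait_for_task(task)
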
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 869.619928] env[59490]: DEBUG nova.compute.manager [req-40e2ea0b-a265-4799-b22d-3389baa77e85 req-81162303-ab8d-4c65-b81b-ab6f8b8de73b service nova] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Received event network-vif-deleted-ec348de0-422c-4320-b593-d676a35120fa {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 869.958608] env[59490]: DEBUG oslo_vmware.exceptions [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 869.958608] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 869.958608] env[59490]: ERROR nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 869.958608] env[59490]: Faults: ['InvalidArgument'] [ 869.958608] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Traceback (most recent call last): [ 869.958608] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 869.958608] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] yield resources [ 869.958608] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 869.958608] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self.driver.spawn(context, instance, image_meta, [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self._fetch_image_if_missing(context, vi) [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] image_cache(vi, tmp_image_ds_loc) [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in 
_cache_sparse_image [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] vm_util.copy_virtual_disk( [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] session._wait_for_task(vmdk_copy_task) [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return self.wait_for_task(task_ref) [ 869.958991] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return evt.wait() [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] result = hub.switch() [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return self.greenlet.switch() [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self.f(*self.args, **self.kw) [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] raise exceptions.translate_fault(task_info.error) [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Faults: ['InvalidArgument'] [ 869.959400] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] [ 869.959400] env[59490]: INFO nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Terminating instance [ 869.964083] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 869.964083] env[59490]: DEBUG oslo_concurrency.lockutils [None 
req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 869.964083] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 869.964083] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 869.964285] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 869.964285] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-846e8136-d91c-4723-b81b-fcc2690c1857 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.972062] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 869.974208] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 869.974208] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6a73e481-98d7-4272-9110-17d209f9b5db {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 869.980437] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Waiting for the task: (returnval){ [ 869.980437] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52563611-ce92-b250-b852-6d246ec5a0c7" [ 869.980437] env[59490]: _type = "Task" [ 869.980437] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 869.992527] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52563611-ce92-b250-b852-6d246ec5a0c7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 870.144355] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 870.306284] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 870.320141] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 870.320141] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 870.320141] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 870.320141] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d2e9c4d-82df-4fe2-92ca-f8f8064ce293 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.329609] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 870.329990] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-035c6011-53b8-4506-b161-4c310d838640 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.371525] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 870.371525] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 870.371525] env[59490]: DEBUG nova.virt.vmwareapi.ds_util 
[None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Deleting the datastore file [datastore2] 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 870.371525] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-43b2dbb5-ba17-4303-b39a-fa5b275cece1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.380210] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 870.380210] env[59490]: value = "task-707423" [ 870.380210] env[59490]: _type = "Task" [ 870.380210] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 870.387841] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': task-707423, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 870.493084] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 870.493084] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Creating directory with path [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 870.493084] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-727bd400-d722-48ec-8df9-3f36c4b0f8bc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.503360] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Created directory with path [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 870.503556] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Fetch image to [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 870.503717] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Downloading image file data 
2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 870.504542] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c047a9e7-749e-458b-951e-ab1d8ef87c59 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.512494] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c1ef9c4-a138-4213-9527-730212ce2df0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.523076] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3704cffb-fd96-482d-a78b-ffa4941d3856 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.560888] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22c3283b-7481-4dd5-9b84-a246aead4436 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.567275] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-aec39de6-c3cf-4408-b04c-b0c7d044a753 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 870.590042] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 870.634600] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 870.695081] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 870.695081] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
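
The "Creating HTTP connection to write to file with size = 21318656" and "Closing write handle" pair above corresponds to oslo.vmware's datastore write handle. A hedged sketch of that upload path, assuming the cookies come from an authenticated vCenter session and the chunk iterator stands in for the Glance image stream:

from oslo_vmware import rw_handles

file_size = 21318656        # size logged above
cookies = []                # placeholder: vCenter session cookies
image_chunks = iter(())     # placeholder: chunks streamed from Glance

# Argument order: host, port, datacenter path, datastore name, cookies,
# file path, file size -- matching the dcPath/dsName URL logged above.
handle = rw_handles.FileWriteHandle(
    'esx7c1n1.openstack.eu-de-1.cloud.sap', 443,
    'ha-datacenter', 'datastore2', cookies,
    'vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/'
    '2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk',
    file_size)
for chunk in image_chunks:
    handle.write(chunk)
handle.close()              # emits the "Closing write handle for ..." line
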
{{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 870.872615] env[59490]: DEBUG nova.compute.manager [req-b34316f8-fc3d-4afc-8411-8013e7c94a5c req-1d8ba487-37ca-45f3-8a9e-bebbd7f0d31d service nova] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Received event network-vif-deleted-012003c2-2cb2-4fd7-87d7-79aa1f6c4a50 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 870.887835] env[59490]: DEBUG oslo_vmware.api [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': task-707423, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.032439} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 870.889976] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 870.889976] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 870.889976] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 870.889976] env[59490]: INFO nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Took 0.57 seconds to destroy the instance on the hypervisor. [ 870.889976] env[59490]: DEBUG oslo.service.loopingcall [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 870.890214] env[59490]: DEBUG nova.compute.manager [-] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Skipping network deallocation for instance since networking was not requested. 
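
The "Acquiring lock ... / Lock ... acquired ... waited / released ... held" triplets that recur throughout this log (the compute_resources lock is taken again just below for abort_instance_claim) are produced by oslo.concurrency's synchronized decorator. A minimal sketch, keeping the lock name from the log but with an illustrative function body:

from oslo_concurrency import lockutils

@lockutils.synchronized('compute_resources', fair=True)  # fair queueing
def abort_instance_claim():
    # Runs with the semaphore held; oslo.concurrency's wrapper logs the
    # acquire/release events together with wait and hold durations.
    pass

abort_instance_claim()
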
{{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 870.892009] env[59490]: DEBUG nova.compute.claims [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 870.892187] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 870.892389] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.016342] env[59490]: DEBUG nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Refreshing inventories for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 871.030223] env[59490]: DEBUG nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Updating ProviderTree inventory for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 871.030433] env[59490]: DEBUG nova.compute.provider_tree [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Updating inventory in ProviderTree for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 871.043625] env[59490]: DEBUG nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Refreshing aggregate associations for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20, aggregates: None {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 871.065106] env[59490]: DEBUG nova.scheduler.client.report [None 
req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Refreshing trait associations for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 871.348518] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07d4cc92-8d17-4925-a55d-bff25e2b7b77 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.355241] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-630b830f-35c3-4c6d-925e-0e241280fcab {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.391345] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff9fcea0-5202-41c0-82a2-5e5bdee3e56b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.397984] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-019cf922-8b3a-41b0-a9f6-3a2a912fd03c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 871.413138] env[59490]: DEBUG nova.compute.provider_tree [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 871.425092] env[59490]: DEBUG nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 871.442047] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.550s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.442574] env[59490]: ERROR nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 871.442574] env[59490]: Faults: ['InvalidArgument'] [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Traceback (most 
recent call last): [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self.driver.spawn(context, instance, image_meta, [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self._fetch_image_if_missing(context, vi) [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] image_cache(vi, tmp_image_ds_loc) [ 871.442574] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] vm_util.copy_virtual_disk( [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] session._wait_for_task(vmdk_copy_task) [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return self.wait_for_task(task_ref) [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return evt.wait() [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] result = hub.switch() [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] return self.greenlet.switch() [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 871.442930] env[59490]: ERROR nova.compute.manager [instance: 
3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] self.f(*self.args, **self.kw) [ 871.443336] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 871.443336] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] raise exceptions.translate_fault(task_info.error) [ 871.443336] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 871.443336] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Faults: ['InvalidArgument'] [ 871.443336] env[59490]: ERROR nova.compute.manager [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] [ 871.443336] env[59490]: DEBUG nova.compute.utils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 871.444883] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Build of instance 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 was re-scheduled: A specified parameter was not correct: fileType [ 871.444883] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 871.445275] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 871.445506] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 871.445663] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 871.445815] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 871.510407] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 871.601815] env[59490]: DEBUG nova.network.neutron [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 871.614949] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "refresh_cache-3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 871.614949] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 871.614949] env[59490]: DEBUG nova.compute.manager [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 871.710493] env[59490]: INFO nova.scheduler.client.report [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Deleted allocations for instance 3edf10fd-14d4-4430-9c6c-1ab0cbc689a4 [ 871.733807] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7ac56970-4272-423b-a787-45614bb85e59 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "3edf10fd-14d4-4430-9c6c-1ab0cbc689a4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 196.597s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 871.758915] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Starting instance... 
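
The inventory payloads refreshed above (and re-checked below for the new claim) are what Placement uses to bound scheduling: effective capacity per resource class is (total - reserved) * allocation_ratio. A quick worked check against the values in this log:

# Inventory values as logged for provider 715aacdb-6e76-47b7-ae6f-492abc122a20.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}
for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {capacity:g}")
# VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
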
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 871.815301] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 871.815532] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 871.818913] env[59490]: INFO nova.compute.claims [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 872.152388] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5fbc5829-f26a-458c-8f05-6a5dd88f9217 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.167019] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebc7fe64-1b7c-44af-8b39-443c0c423b54 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.200181] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c52fab44-8c55-4334-bc25-95050513555e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.208914] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf899068-3f48-42ce-96d2-b5ce62afdceb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.225381] env[59490]: DEBUG nova.compute.provider_tree [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 872.237846] env[59490]: DEBUG nova.scheduler.client.report [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 872.258552] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 
tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.443s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 872.259017] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 872.300267] env[59490]: DEBUG nova.compute.utils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 872.301440] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 872.301642] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 872.314504] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 872.396749] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 872.423742] env[59490]: DEBUG nova.policy [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '32fffc7664814bdba81ed340d27e444c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5530a0bb6d434878aed7b9c96009b416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 872.434189] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 872.435042] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 872.435042] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 872.435464] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 872.437574] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 872.437875] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 872.438277] env[59490]: DEBUG nova.virt.hardware [None 
req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 872.438710] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 872.439030] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 872.439394] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 872.439627] env[59490]: DEBUG nova.virt.hardware [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 872.441475] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4da10af-8678-4ee8-9da3-dd72f72bf619 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 872.454160] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa388d56-17aa-4c45-a4f1-b4fd56096137 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 873.136190] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Successfully created port: 6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 873.821206] env[59490]: DEBUG nova.compute.manager [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Received event network-vif-plugged-6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 873.821425] env[59490]: DEBUG oslo_concurrency.lockutils [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] Acquiring lock "504e16b8-70d2-437f-ab3e-7631cb74abec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 873.821622] env[59490]: DEBUG oslo_concurrency.lockutils [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] Lock 
"504e16b8-70d2-437f-ab3e-7631cb74abec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 873.821778] env[59490]: DEBUG oslo_concurrency.lockutils [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] Lock "504e16b8-70d2-437f-ab3e-7631cb74abec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 873.821934] env[59490]: DEBUG nova.compute.manager [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] No waiting events found dispatching network-vif-plugged-6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 873.822406] env[59490]: WARNING nova.compute.manager [req-9b9dc781-3102-4849-8cfb-4f12b1a4135b req-4d4462e1-fbb5-4188-863d-4e8ab0132682 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Received unexpected event network-vif-plugged-6c8c38dc-b04a-4682-9ff2-f0d17854a53c for instance with vm_state building and task_state spawning. [ 873.930765] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Successfully updated port: 6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 873.939425] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 873.939577] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 873.939865] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 874.003945] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 874.225913] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Updating instance_info_cache with network_info: [{"id": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "address": "fa:16:3e:e1:05:96", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c8c38dc-b0", "ovs_interfaceid": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 874.238771] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 874.242110] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance network_info: |[{"id": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "address": "fa:16:3e:e1:05:96", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c8c38dc-b0", "ovs_interfaceid": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 874.242421] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e1:05:96', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c2d3bf80-d60a-4b53-a00a-1381de6d4a12', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6c8c38dc-b04a-4682-9ff2-f0d17854a53c', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 874.247510] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating folder: Project (5530a0bb6d434878aed7b9c96009b416). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 874.247927] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-92fad31f-4567-4b58-b032-d1c897d09482 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 874.260009] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created folder: Project (5530a0bb6d434878aed7b9c96009b416) in parent group-v168905. [ 874.261140] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating folder: Instances. Parent ref: group-v168956. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 874.261140] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e108907-2590-4ae6-9616-5125bc81f0a4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 874.269241] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created folder: Instances in parent group-v168956. [ 874.269441] env[59490]: DEBUG oslo.service.loopingcall [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 874.269611] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 874.269807] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-38a78264-653e-4cc6-bf3c-d20ca6fa3658 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 874.289817] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 874.289817] env[59490]: value = "task-707426" [ 874.289817] env[59490]: _type = "Task" [ 874.289817] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 874.300722] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707426, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 874.636442] env[59490]: DEBUG nova.compute.manager [req-c90fb818-55ff-4c4a-8dbf-92ce823bd05a req-bcc1ddb6-6fba-4e0f-b301-b488dcacc2b4 service nova] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Received event network-vif-deleted-b8dc67a8-c070-49a3-af75-8091871a2e25 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 874.804786] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707426, 'name': CreateVM_Task, 'duration_secs': 0.292766} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 874.804967] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 874.805666] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 874.805848] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 874.806189] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 874.806483] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bde5f05f-5746-4b0e-bba1-0de85ef07b47 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 874.811310] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 874.811310] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5252202e-0fa7-2436-596d-bfa73d8329eb" [ 874.811310] env[59490]: _type = "Task" [ 874.811310] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 874.820156] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5252202e-0fa7-2436-596d-bfa73d8329eb, 'name': SearchDatastore_Task} progress is 0%. 
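Editor's note: the CreateVM_Task and SearchDatastore_Task entries above follow one pattern throughout this log: a SOAP call returns a task handle, and the client polls it until it reports success or a fault, logging "progress is N%" along the way. A generic sketch of that poll loop, assuming a poll_progress() callable that returns a dict with 'state', 'progress', and optional 'error' keys (an illustration of the pattern, not oslo.vmware's actual interface):

    import time

    class TaskFailed(Exception):
        pass

    def wait_for_task(poll_progress, interval=0.5, timeout=300):
        # Poll until the task reaches a terminal state or the deadline passes.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = poll_progress()
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                # Analogous to translating a task fault into an exception.
                raise TaskFailed(info.get('error', 'unknown fault'))
            print(f"Task progress is {info.get('progress', 0)}%.")
            time.sleep(interval)
        raise TimeoutError('task did not complete in time')

The 'duration_secs': 0.292766 on the completed CreateVM_Task above is the server-side task duration reported once the poll observes the terminal state.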
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 875.323713] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 875.324058] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 875.324158] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 875.935489] env[59490]: DEBUG nova.compute.manager [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Received event network-changed-6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 875.935736] env[59490]: DEBUG nova.compute.manager [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Refreshing instance network info cache due to event network-changed-6c8c38dc-b04a-4682-9ff2-f0d17854a53c. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 875.935796] env[59490]: DEBUG oslo_concurrency.lockutils [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] Acquiring lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 875.935918] env[59490]: DEBUG oslo_concurrency.lockutils [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] Acquired lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 875.936082] env[59490]: DEBUG nova.network.neutron [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Refreshing network info cache for port 6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 876.210312] env[59490]: DEBUG nova.network.neutron [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Updated VIF entry in instance network info cache for port 6c8c38dc-b04a-4682-9ff2-f0d17854a53c. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 876.210643] env[59490]: DEBUG nova.network.neutron [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Updating instance_info_cache with network_info: [{"id": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "address": "fa:16:3e:e1:05:96", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6c8c38dc-b0", "ovs_interfaceid": "6c8c38dc-b04a-4682-9ff2-f0d17854a53c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 876.220012] env[59490]: DEBUG oslo_concurrency.lockutils [req-2b41c0c1-c47d-4aef-9acb-8714192310a4 req-c6ebeb5e-86b9-44f8-84e7-48550a57f5c5 service nova] Releasing lock "refresh_cache-504e16b8-70d2-437f-ab3e-7631cb74abec" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 878.507175] env[59490]: DEBUG nova.compute.manager [req-089c3657-ead5-408f-abd4-d86a6de54d04 req-b99afb22-1484-420f-9623-672a1e1f432c service nova] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Received event network-vif-deleted-72ccc8c5-8b84-4ce5-a12a-16920839c294 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 878.694112] env[59490]: DEBUG nova.compute.manager [req-f5135a98-b561-4a54-b488-4a31e5f88900 req-e62a29af-68d5-4488-94b6-93b75e0c342c service nova] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Received event network-vif-deleted-81b838dd-6028-40eb-ad00-c1499bff521a {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 880.649956] env[59490]: DEBUG nova.compute.manager [req-29594eb9-c0da-44e2-99cb-a845d329a434 req-d07288cf-907a-4271-8f63-a9e6bc328f42 service nova] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Received event network-vif-deleted-b662ce79-2d96-4f63-9fda-6795b5a9e8ca {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 881.356325] env[59490]: DEBUG nova.compute.manager [req-b283a6d2-3eee-4d02-bff1-5d62c31b98dd req-736b37e5-5271-452f-a4f9-3d9f4bcd1c1c service nova] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Received event network-vif-deleted-889c28a1-e5e9-46f2-8d60-a4416d197765 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 883.429106] env[59490]: DEBUG nova.compute.manager [req-46791b7c-8453-46b8-809e-9004c608c7fc req-dcb27d73-fc36-431d-a7b8-2da9a407b685 service nova] [instance: 
2f083456-3eb9-4022-86a3-8d39f83c470f] Received event network-vif-deleted-b0275025-626e-4293-bb18-a14ae7ed9ca5 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 886.384577] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 886.385266] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 886.385600] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 886.400023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.400023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.400023] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 886.400023] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 886.400023] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da440f03-a63b-446c-940d-d1de232329c7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.411014] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b44ece6-efe1-4a60-9aac-19e445ed06b5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.432709] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efe88fa7-1756-488f-8649-c5c8fe3dd687 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.443903] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc9e9eaf-1689-4a89-81bc-a1de9dd92648 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.472124] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: 
name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181645MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 886.472293] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 886.472558] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 886.531051] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 504e16b8-70d2-437f-ab3e-7631cb74abec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 886.553212] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 12082268-a4a2-4eb8-9adc-93c7e7d82c42 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.567838] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance de66fe1b-8f03-4a10-af9d-302cb5021f79 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.584511] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.603691] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance c1a44e57-5a85-475c-898e-9f30e0c6b492 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.620791] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2cbf0a49-2835-41c2-8840-9515f1e95d5a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.636620] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d5b78319-88f9-4771-8b14-e833d05eb3d6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.649204] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance cfe59672-be1d-43ed-b8d4-b5ed51e08a34 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.660135] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ead3d36-7d65-4e6d-be85-a6736acd3802 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.679352] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 886.679352] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 886.679352] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 886.866911] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1360324-e791-4d36-bce4-b95658338f3f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.877933] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f033804-ba7f-4d8c-a754-310bea0d8c1e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.914907] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62947298-c3d7-407a-b5bb-04dd3953c55d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.924685] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43ea86e1-930a-418c-afe1-ebdaa76f5660 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 886.937207] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 886.951749] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 886.967187] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 886.967394] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.495s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 887.968667] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic 
task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 889.379542] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 889.386206] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 889.386206] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 890.379667] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 890.401046] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 890.401046] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 890.401046] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 890.417398] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 890.419066] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
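Editor's note: the inventory the resource tracker reports a few entries above folds into schedulable capacity as (total - reserved) * allocation_ratio per resource class, which is how placement derives usable capacity. With the figures in this log, 48 physical VCPUs at a 4.0 allocation ratio advertise 192 schedulable VCPUs, while memory and disk keep 1.0 ratios. A small worked sketch of that arithmetic:

    def effective_capacity(inventory):
        # Usable capacity per resource class: (total - reserved) * ratio.
        return {
            rc: int((inv['total'] - inv['reserved']) * inv['allocation_ratio'])
            for rc, inv in inventory.items()
        }

    # Figures taken from the inventory data logged above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    print(effective_capacity(inventory))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}

This also explains the "has been scheduled ... but the instance has yet to start" entries: the scheduler already holds an allocation against this node for each pending instance, so the audit skips healing them rather than double-counting.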
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 890.419066] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 892.384617] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 892.872272] env[59490]: DEBUG nova.compute.manager [req-f3d79ff2-d0fb-4e29-b535-da26c95e6ea4 req-645b1020-2ce3-4342-a3e1-7073476c7d41 service nova] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Received event network-vif-deleted-6c8c38dc-b04a-4682-9ff2-f0d17854a53c {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 894.842310] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 894.843361] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 895.448188] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "ddbac2db-c555-4554-aa21-7303c8e36371" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 895.448188] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "ddbac2db-c555-4554-aa21-7303c8e36371" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 895.481638] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 895.482463] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f" 
acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 896.738402] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "e879cc90-f290-42cd-9059-46f42284a32c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 896.738819] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "e879cc90-f290-42cd-9059-46f42284a32c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 897.461773] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "f6d58f5a-f432-47a2-af63-033ae4c3d414" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 897.461773] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "f6d58f5a-f432-47a2-af63-033ae4c3d414" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 898.876681] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "f4bbfad2-f118-4292-bb36-4229c333dd4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 898.877013] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "f4bbfad2-f118-4292-bb36-4229c333dd4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.585798] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "014bca6d-9df7-4245-90b4-3f291262292a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.586039] env[59490]: DEBUG oslo_concurrency.lockutils [None 
req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "014bca6d-9df7-4245-90b4-3f291262292a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 899.958600] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "d0673be9-d670-4d3f-aefa-26f4e336a695" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 899.958600] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "d0673be9-d670-4d3f-aefa-26f4e336a695" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 906.165646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80487f3d-8756-42c9-9e2b-083312244fe0 tempest-ServerTagsTestJSON-1826652627 tempest-ServerTagsTestJSON-1826652627-project-member] Acquiring lock "ecb7312c-80f0-490e-8357-7138680d0f90" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 906.165646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80487f3d-8756-42c9-9e2b-083312244fe0 tempest-ServerTagsTestJSON-1826652627 tempest-ServerTagsTestJSON-1826652627-project-member] Lock "ecb7312c-80f0-490e-8357-7138680d0f90" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 908.547383] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e17cc6b6-765a-479a-96ec-ba00b4be8ca5 tempest-ServerAddressesNegativeTestJSON-326820771 tempest-ServerAddressesNegativeTestJSON-326820771-project-member] Acquiring lock "e24d5bbc-6168-4523-9a0c-cd29c14c9e56" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 908.547383] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e17cc6b6-765a-479a-96ec-ba00b4be8ca5 tempest-ServerAddressesNegativeTestJSON-326820771 tempest-ServerAddressesNegativeTestJSON-326820771-project-member] Lock "e24d5bbc-6168-4523-9a0c-cd29c14c9e56" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 909.133387] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1d466185-1143-432b-a936-8d3a1690079b tempest-ServerMetadataNegativeTestJSON-221694196 tempest-ServerMetadataNegativeTestJSON-221694196-project-member] Acquiring lock "643bfd74-592a-452c-af62-ded4c23009f9" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 909.133876] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1d466185-1143-432b-a936-8d3a1690079b tempest-ServerMetadataNegativeTestJSON-221694196 tempest-ServerMetadataNegativeTestJSON-221694196-project-member] Lock "643bfd74-592a-452c-af62-ded4c23009f9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 920.705029] env[59490]: WARNING oslo_vmware.rw_handles [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 920.705029] env[59490]: ERROR oslo_vmware.rw_handles [ 920.705540] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 920.707763] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 920.708304] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Copying Virtual Disk [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/4386753e-a28a-46d7-84ed-8da7f5588ca3/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk 
{{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 920.708304] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f1ae65fd-f71d-45af-b594-9143da6a1b3e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 920.715737] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Waiting for the task: (returnval){ [ 920.715737] env[59490]: value = "task-707427" [ 920.715737] env[59490]: _type = "Task" [ 920.715737] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 920.723433] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Task: {'id': task-707427, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 921.226506] env[59490]: DEBUG oslo_vmware.exceptions [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 921.226819] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 921.227355] env[59490]: ERROR nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 921.227355] env[59490]: Faults: ['InvalidArgument'] [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Traceback (most recent call last): [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] yield resources [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] self.driver.spawn(context, instance, image_meta, [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 
71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] self._fetch_image_if_missing(context, vi) [ 921.227355] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] image_cache(vi, tmp_image_ds_loc) [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] vm_util.copy_virtual_disk( [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] session._wait_for_task(vmdk_copy_task) [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] return self.wait_for_task(task_ref) [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] return evt.wait() [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] result = hub.switch() [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 921.227701] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] return self.greenlet.switch() [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] self.f(*self.args, **self.kw) [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] raise exceptions.translate_fault(task_info.error) [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 921.227977] env[59490]: ERROR nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Faults: ['InvalidArgument'] [ 921.227977] env[59490]: ERROR 
nova.compute.manager [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] [ 921.227977] env[59490]: INFO nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Terminating instance [ 921.229175] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 921.229843] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 921.229843] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-545b7208-e3f8-4d10-bd00-cc745a6ded04 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.231964] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 921.232161] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 921.232863] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3cefba2a-9a4d-4c0b-8a18-a8c7f5f10075 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.239796] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 921.239987] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-edfee9e2-2f35-4e11-896f-fabb646ea94e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.242138] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 921.242301] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Folder 
[datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 921.243231] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-38d16776-ca50-4adc-8c64-538fb3dd310e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.248133] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Waiting for the task: (returnval){ [ 921.248133] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52890be7-0864-3bda-c69c-80b125717c22" [ 921.248133] env[59490]: _type = "Task" [ 921.248133] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 921.255018] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52890be7-0864-3bda-c69c-80b125717c22, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 921.310534] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 921.310751] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 921.310922] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Deleting the datastore file [datastore2] 71698ce4-94a0-442c-8081-374616ce2ac4 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 921.311197] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ecc74685-5e8d-472f-a8fc-29ef88b60438 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.317812] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Waiting for the task: (returnval){ [ 921.317812] env[59490]: value = "task-707429" [ 921.317812] env[59490]: _type = "Task" [ 921.317812] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 921.325507] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Task: {'id': task-707429, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 921.759290] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 921.759614] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Creating directory with path [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 921.759857] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9875ad84-0c1f-4f45-9586-2166106cd49b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.770770] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Created directory with path [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 921.770953] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Fetch image to [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 921.771135] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 921.771906] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dab79f72-0179-4085-aa70-95d264632aef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.779955] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f278d92-4fbb-4463-a386-198e6a023cd7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.788838] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd27b303-2d0f-4a4a-a205-aabcc94ed9e1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.821868] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb2b4c2f-d6b6-4991-8401-ccc1bf8c8186 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.830222] env[59490]: DEBUG oslo_vmware.api [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Task: {'id': task-707429, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073814} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 921.830701] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 921.830875] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 921.831081] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 921.831265] env[59490]: INFO nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 921.832803] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-58315ab8-c81a-4870-8391-c035e53bc92f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 921.834660] env[59490]: DEBUG nova.compute.claims [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 921.834826] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 921.835038] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 921.860305] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 921.865044] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.030s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 921.865447] env[59490]: DEBUG nova.compute.utils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance 71698ce4-94a0-442c-8081-374616ce2ac4 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 921.867383] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance disappeared during build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 921.867599] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 921.867774] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 921.867936] env[59490]: DEBUG nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 921.868112] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 921.900048] env[59490]: DEBUG nova.network.neutron [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 921.907303] env[59490]: DEBUG oslo_vmware.rw_handles [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 921.960417] env[59490]: INFO nova.compute.manager [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Took 0.09 seconds to deallocate network for instance. [ 921.966271] env[59490]: DEBUG oslo_vmware.rw_handles [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Completed reading data from the image iterator. 
{{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 921.966435] env[59490]: DEBUG oslo_vmware.rw_handles [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 922.006261] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a2aff7b-dbb7-43b4-ba60-643a19c2178a tempest-ServerActionsTestOtherA-352470216 tempest-ServerActionsTestOtherA-352470216-project-member] Lock "71698ce4-94a0-442c-8081-374616ce2ac4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 254.186s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.015706] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 12082268-a4a2-4eb8-9adc-93c7e7d82c42] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.040254] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 12082268-a4a2-4eb8-9adc-93c7e7d82c42] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.061008] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "12082268-a4a2-4eb8-9adc-93c7e7d82c42" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 231.177s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.069388] env[59490]: DEBUG nova.compute.manager [None req-9ac777b5-df98-4a01-9282-2b266d082fd4 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: de66fe1b-8f03-4a10-af9d-302cb5021f79] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.091527] env[59490]: DEBUG nova.compute.manager [None req-9ac777b5-df98-4a01-9282-2b266d082fd4 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: de66fe1b-8f03-4a10-af9d-302cb5021f79] Instance disappeared before build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.110608] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9ac777b5-df98-4a01-9282-2b266d082fd4 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "de66fe1b-8f03-4a10-af9d-302cb5021f79" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.750s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.118991] env[59490]: DEBUG nova.compute.manager [None req-911027a7-e39b-481a-816b-35a39c0d4c61 tempest-ServersV294TestFqdnHostnames-1291042165 tempest-ServersV294TestFqdnHostnames-1291042165-project-member] [instance: b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.143690] env[59490]: DEBUG nova.compute.manager [None req-911027a7-e39b-481a-816b-35a39c0d4c61 tempest-ServersV294TestFqdnHostnames-1291042165 tempest-ServersV294TestFqdnHostnames-1291042165-project-member] [instance: b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.166411] env[59490]: DEBUG oslo_concurrency.lockutils [None req-911027a7-e39b-481a-816b-35a39c0d4c61 tempest-ServersV294TestFqdnHostnames-1291042165 tempest-ServersV294TestFqdnHostnames-1291042165-project-member] Lock "b1e2f9d4-76e4-4c2e-86cb-3de1fb9dd364" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.975s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.177637] env[59490]: DEBUG nova.compute.manager [None req-b84f1516-ea7a-4dcc-adff-48af9aaea268 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: c1a44e57-5a85-475c-898e-9f30e0c6b492] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.209021] env[59490]: DEBUG nova.compute.manager [None req-b84f1516-ea7a-4dcc-adff-48af9aaea268 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: c1a44e57-5a85-475c-898e-9f30e0c6b492] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.231013] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b84f1516-ea7a-4dcc-adff-48af9aaea268 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "c1a44e57-5a85-475c-898e-9f30e0c6b492" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.939s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.247258] env[59490]: DEBUG nova.compute.manager [None req-1e98a9f3-ec85-41f4-a911-c9cb66091dbd tempest-SecurityGroupsTestJSON-19223672 tempest-SecurityGroupsTestJSON-19223672-project-member] [instance: 2cbf0a49-2835-41c2-8840-9515f1e95d5a] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.283706] env[59490]: DEBUG nova.compute.manager [None req-1e98a9f3-ec85-41f4-a911-c9cb66091dbd tempest-SecurityGroupsTestJSON-19223672 tempest-SecurityGroupsTestJSON-19223672-project-member] [instance: 2cbf0a49-2835-41c2-8840-9515f1e95d5a] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.312983] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1e98a9f3-ec85-41f4-a911-c9cb66091dbd tempest-SecurityGroupsTestJSON-19223672 tempest-SecurityGroupsTestJSON-19223672-project-member] Lock "2cbf0a49-2835-41c2-8840-9515f1e95d5a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.948s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.322510] env[59490]: DEBUG nova.compute.manager [None req-60684222-8bfa-400a-8103-572d32c227f1 tempest-ServerActionsTestOtherB-715024718 tempest-ServerActionsTestOtherB-715024718-project-member] [instance: d5b78319-88f9-4771-8b14-e833d05eb3d6] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.348471] env[59490]: DEBUG nova.compute.manager [None req-60684222-8bfa-400a-8103-572d32c227f1 tempest-ServerActionsTestOtherB-715024718 tempest-ServerActionsTestOtherB-715024718-project-member] [instance: d5b78319-88f9-4771-8b14-e833d05eb3d6] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.370765] env[59490]: DEBUG oslo_concurrency.lockutils [None req-60684222-8bfa-400a-8103-572d32c227f1 tempest-ServerActionsTestOtherB-715024718 tempest-ServerActionsTestOtherB-715024718-project-member] Lock "d5b78319-88f9-4771-8b14-e833d05eb3d6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 225.506s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.379789] env[59490]: DEBUG nova.compute.manager [None req-8e2b81fd-877f-49e5-8e94-747b25a8b66a tempest-ServerGroupTestJSON-200953235 tempest-ServerGroupTestJSON-200953235-project-member] [instance: cfe59672-be1d-43ed-b8d4-b5ed51e08a34] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.407389] env[59490]: DEBUG nova.compute.manager [None req-8e2b81fd-877f-49e5-8e94-747b25a8b66a tempest-ServerGroupTestJSON-200953235 tempest-ServerGroupTestJSON-200953235-project-member] [instance: cfe59672-be1d-43ed-b8d4-b5ed51e08a34] Instance disappeared before build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 922.428273] env[59490]: DEBUG oslo_concurrency.lockutils [None req-8e2b81fd-877f-49e5-8e94-747b25a8b66a tempest-ServerGroupTestJSON-200953235 tempest-ServerGroupTestJSON-200953235-project-member] Lock "cfe59672-be1d-43ed-b8d4-b5ed51e08a34" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.068s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.438009] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 922.485370] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 922.485598] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 922.487096] env[59490]: INFO nova.compute.claims [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 922.667864] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6e3e54b-01a2-4b98-839c-a60462c12eeb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.676115] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30377e61-c837-4eca-9b5e-b09740ab16cf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.705085] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3972a4f0-2f50-400f-adf9-d15b4eac9004 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.712424] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97a90b63-d321-425a-8f62-200a1b8a1113 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.725502] env[59490]: DEBUG nova.compute.provider_tree [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 922.733909] env[59490]: DEBUG nova.scheduler.client.report [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 
tempest-ServersTestJSON-1646516409-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 922.747030] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.261s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 922.747392] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 922.812755] env[59490]: DEBUG nova.compute.utils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 922.813936] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 922.814742] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 922.823031] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Start building block device mappings for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 922.869500] env[59490]: DEBUG nova.policy [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '59edea3a90eb45c28fbb5ceb426d0629', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '312c91f87af54c4abacd034186d368d4', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 922.883061] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 922.903902] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 922.904154] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 922.904325] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 922.904541] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 922.904713] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 922.904864] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 
tempest-ServersTestJSON-1646516409-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 922.905104] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 922.905271] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 922.905438] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 922.905599] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 922.905767] env[59490]: DEBUG nova.virt.hardware [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 922.906664] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fe3dedf-b5be-4953-93e8-5ab7d631995b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 922.915536] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-568d5a55-b09d-4336-9dac-a5af2273ebbb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 923.142016] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Successfully created port: 41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 923.595585] env[59490]: DEBUG nova.compute.manager [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Received event network-vif-plugged-41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 923.595804] env[59490]: DEBUG oslo_concurrency.lockutils [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] Acquiring lock "0ead3d36-7d65-4e6d-be85-a6736acd3802-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 923.596007] env[59490]: DEBUG oslo_concurrency.lockutils [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] Lock "0ead3d36-7d65-4e6d-be85-a6736acd3802-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 923.596167] env[59490]: DEBUG oslo_concurrency.lockutils [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] Lock "0ead3d36-7d65-4e6d-be85-a6736acd3802-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 923.596321] env[59490]: DEBUG nova.compute.manager [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] No waiting events found dispatching network-vif-plugged-41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 923.596615] env[59490]: WARNING nova.compute.manager [req-9cd965db-a301-4da1-9957-8b4edd3173f1 req-281a8001-ca0a-4223-a0e0-87e8ab39134b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Received unexpected event network-vif-plugged-41409afd-dae8-473b-b3ba-424e7b48999b for instance with vm_state building and task_state spawning. [ 923.670900] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Successfully updated port: 41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 923.680445] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 923.680591] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 923.680731] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 923.720116] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 923.877946] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Updating instance_info_cache with network_info: [{"id": "41409afd-dae8-473b-b3ba-424e7b48999b", "address": "fa:16:3e:98:81:2c", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41409afd-da", "ovs_interfaceid": "41409afd-dae8-473b-b3ba-424e7b48999b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 923.888468] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 923.888680] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance network_info: |[{"id": "41409afd-dae8-473b-b3ba-424e7b48999b", "address": "fa:16:3e:98:81:2c", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41409afd-da", "ovs_interfaceid": "41409afd-dae8-473b-b3ba-424e7b48999b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 923.889068] env[59490]: DEBUG 
nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:98:81:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3c405e9f-a6c8-4308-acac-071654efe18e', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '41409afd-dae8-473b-b3ba-424e7b48999b', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 923.896944] env[59490]: DEBUG oslo.service.loopingcall [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 923.897375] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 923.897610] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1d05d7cc-fe61-44b1-aa39-8116e88539ec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 923.918042] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 923.918042] env[59490]: value = "task-707430" [ 923.918042] env[59490]: _type = "Task" [ 923.918042] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 923.925704] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707430, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 924.428647] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707430, 'name': CreateVM_Task, 'duration_secs': 0.266968} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 924.428817] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 924.429478] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 924.429634] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 924.429932] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 924.430175] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17b16a07-6ff2-4d6e-bfeb-3715d5a488cf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 924.434554] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){ [ 924.434554] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]523f0ec8-70b9-4213-5ab7-a43ab0895b32" [ 924.434554] env[59490]: _type = "Task" [ 924.434554] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 924.441797] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]523f0ec8-70b9-4213-5ab7-a43ab0895b32, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 924.945502] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 924.945784] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 924.945950] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 925.622814] env[59490]: DEBUG nova.compute.manager [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Received event network-changed-41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 925.623044] env[59490]: DEBUG nova.compute.manager [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Refreshing instance network info cache due to event network-changed-41409afd-dae8-473b-b3ba-424e7b48999b. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 925.623255] env[59490]: DEBUG oslo_concurrency.lockutils [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] Acquiring lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 925.623346] env[59490]: DEBUG oslo_concurrency.lockutils [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] Acquired lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 925.623498] env[59490]: DEBUG nova.network.neutron [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Refreshing network info cache for port 41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 925.840888] env[59490]: DEBUG nova.network.neutron [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Updated VIF entry in instance network info cache for port 41409afd-dae8-473b-b3ba-424e7b48999b. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 925.841251] env[59490]: DEBUG nova.network.neutron [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Updating instance_info_cache with network_info: [{"id": "41409afd-dae8-473b-b3ba-424e7b48999b", "address": "fa:16:3e:98:81:2c", "network": {"id": "234b4228-8801-458b-8284-3289c056ac94", "bridge": "br-int", "label": "tempest-ServersTestJSON-549939068-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "312c91f87af54c4abacd034186d368d4", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3c405e9f-a6c8-4308-acac-071654efe18e", "external-id": "nsx-vlan-transportzone-851", "segmentation_id": 851, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap41409afd-da", "ovs_interfaceid": "41409afd-dae8-473b-b3ba-424e7b48999b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 925.850487] env[59490]: DEBUG oslo_concurrency.lockutils [req-cb2c05d7-6e9b-4adb-a38d-515a14f05a7f req-cf01f932-5062-4c8d-b855-ae03db4fa07b service nova] Releasing lock "refresh_cache-0ead3d36-7d65-4e6d-be85-a6736acd3802" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 946.385121] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 946.385487] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 946.385487] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 946.395705] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.395898] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.396081] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 946.396253] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 946.397340] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3db0d13a-537f-448c-bd36-5e3ebe7a513d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.407151] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52787312-fe40-4769-8a0f-c522f5d627e3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.420761] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8783245-ea83-475e-8098-97c384a111c3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.426967] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-639c7c86-2cf6-4cf5-8d9d-62cf3cd0fe22 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.455998] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181678MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 946.456145] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 946.456344] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 
None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 946.496051] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 0ead3d36-7d65-4e6d-be85-a6736acd3802 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 946.507164] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.517952] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.527961] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ddbac2db-c555-4554-aa21-7303c8e36371 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.538645] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.549836] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e879cc90-f290-42cd-9059-46f42284a32c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.560317] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.569665] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f4bbfad2-f118-4292-bb36-4229c333dd4c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.579327] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 014bca6d-9df7-4245-90b4-3f291262292a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.591583] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d0673be9-d670-4d3f-aefa-26f4e336a695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.600556] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ecb7312c-80f0-490e-8357-7138680d0f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.609854] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e24d5bbc-6168-4523-9a0c-cd29c14c9e56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.619288] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 643bfd74-592a-452c-af62-ded4c23009f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 946.619490] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 946.619631] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 946.764866] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3d5bae9-7e26-4108-b5b4-ee91305c9b4c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.772916] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975cd880-c058-4e68-a754-61e9938fe3c9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.802153] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c886c226-12c5-43f7-9bb6-05452ec985a8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.809286] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5799535-1969-46d1-8bf3-52deec862029 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 946.821533] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 946.829901] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 946.843881] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 946.844058] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.388s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 947.843473] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic 
task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 949.384141] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 950.379560] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 950.383166] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 950.383321] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 950.383437] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 950.393186] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 950.393512] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 950.393512] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 951.384351] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 954.384470] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 970.374751] env[59490]: WARNING oslo_vmware.rw_handles [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 970.374751] env[59490]: ERROR oslo_vmware.rw_handles [ 970.375404] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 970.376977] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 970.377240] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] 
Copying Virtual Disk [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/0bb33e3b-5743-4583-9d07-82a30921486b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 970.377572] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-38f76e8c-b902-4f71-8143-75afd1749375 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.384792] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Waiting for the task: (returnval){ [ 970.384792] env[59490]: value = "task-707431" [ 970.384792] env[59490]: _type = "Task" [ 970.384792] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 970.394426] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Task: {'id': task-707431, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 970.895438] env[59490]: DEBUG oslo_vmware.exceptions [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 970.895694] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 970.897025] env[59490]: ERROR nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 970.897025] env[59490]: Faults: ['InvalidArgument'] [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last): [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] yield resources [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.driver.spawn(context, instance, image_meta, [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._fetch_image_if_missing(context, vi) [ 970.897025] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] image_cache(vi, tmp_image_ds_loc) [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] vm_util.copy_virtual_disk( [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] session._wait_for_task(vmdk_copy_task) [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.wait_for_task(task_ref) [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return evt.wait() [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] result = hub.switch() [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 970.897468] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.greenlet.switch() [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.f(*self.args, **self.kw) [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise exceptions.translate_fault(task_info.error) [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: 
ad8223ea-b097-439f-bcff-9c06bd1cf5e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Faults: ['InvalidArgument'] [ 970.897765] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 970.897765] env[59490]: INFO nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Terminating instance [ 970.898123] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 970.898337] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 970.898556] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8ce2c832-7f78-492d-8c03-8760730e782c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.900668] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 970.900853] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 970.901558] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13a9d2ab-67f6-4aed-99ce-2843c0932af3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.908362] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 970.908588] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2d0a5081-4df0-4735-8765-b1b012404c66 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.910681] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 970.910844] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 970.911799] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-908518c6-9c74-48f4-9bc9-bac2ec6f28d7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.916461] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Waiting for the task: (returnval){ [ 970.916461] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]524054bc-007c-d83e-7e45-94ba7667d0e4" [ 970.916461] env[59490]: _type = "Task" [ 970.916461] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 970.924615] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]524054bc-007c-d83e-7e45-94ba7667d0e4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 970.975257] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 970.975569] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 970.975660] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Deleting the datastore file [datastore2] ad8223ea-b097-439f-bcff-9c06bd1cf5e6 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 970.977020] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5d16040d-3997-4dbf-87ec-aa1cbbafff97 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 970.982405] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Waiting for the task: (returnval){ [ 970.982405] env[59490]: value = "task-707433" [ 970.982405] env[59490]: _type = "Task" [ 970.982405] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 970.989835] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Task: {'id': task-707433, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 971.426554] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 971.426907] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Creating directory with path [datastore2] vmware_temp/a1be5613-bd42-478d-8267-60567c9dd618/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 971.427019] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b1b7c190-2c04-4a2f-8ebb-5a1843d7c1ef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.438155] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Created directory with path [datastore2] vmware_temp/a1be5613-bd42-478d-8267-60567c9dd618/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 971.438357] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Fetch image to [datastore2] vmware_temp/a1be5613-bd42-478d-8267-60567c9dd618/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 971.438581] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/a1be5613-bd42-478d-8267-60567c9dd618/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 971.439556] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c149782d-0469-4a92-a5c8-f06bcbb621f7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.447847] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57dbb942-3b08-48c9-8329-7c4df8589914 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.460060] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e44a4ebe-aafd-4794-b21d-4696c946b936 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.504674] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13414cbb-cd1b-4486-9359-735d83389efc {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.512364] env[59490]: DEBUG oslo_vmware.api [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Task: {'id': task-707433, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075598} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 971.513721] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 971.513901] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 971.514080] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 971.514251] env[59490]: INFO nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 971.516199] env[59490]: DEBUG nova.compute.claims [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 971.516355] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 971.516588] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 971.519105] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-922bf4df-5802-4c59-a06b-a7f330e32c4a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 971.543973] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 971.544726] env[59490]: DEBUG nova.compute.utils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 971.546106] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 971.546270] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 971.546474] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 971.546642] env[59490]: DEBUG nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 971.546795] env[59490]: DEBUG nova.network.neutron [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 971.618512] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 971.679649] env[59490]: DEBUG neutronclient.v2_0.client [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 971.682701] env[59490]: ERROR nova.compute.manager [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last): [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.driver.spawn(context, instance, image_meta, [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._fetch_image_if_missing(context, vi) [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] image_cache(vi, tmp_image_ds_loc) [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 971.682701] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] vm_util.copy_virtual_disk( [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] session._wait_for_task(vmdk_copy_task) [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.wait_for_task(task_ref) [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return evt.wait() [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] result = hub.switch() [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.greenlet.switch() [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File 
"/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.f(*self.args, **self.kw) [ 971.683189] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise exceptions.translate_fault(task_info.error) [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Faults: ['InvalidArgument'] [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] During handling of the above exception, another exception occurred: [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last): [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._build_and_run_instance(context, instance, image, [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] with excutils.save_and_reraise_exception(): [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.force_reraise() [ 971.683691] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise self.value [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] with self.rt.instance_claim(context, instance, node, allocs, [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.abort() [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 971.684129] env[59490]: 
ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return f(*args, **kwargs) [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._unset_instance_host_and_node(instance) [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 971.684129] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] instance.save() [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] updates, result = self.indirection_api.object_action( [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return cctxt.call(context, 'object_action', objinst=objinst, [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] result = self.transport._send( [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self._driver.send(target, ctxt, message, [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 971.684461] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise result [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] nova.exception_Remote.InstanceNotFound_Remote: Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 could not be found. 
[ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last): [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return getattr(target, method)(*args, **kwargs) [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return fn(self, *args, **kwargs) [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] old_ref, inst_ref = db.instance_update_and_get_original( [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return f(*args, **kwargs) [ 971.684730] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] with excutils.save_and_reraise_exception() as ectxt: [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.force_reraise() [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise self.value [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return f(*args, **kwargs) [ 971.685069] 
env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return f(context, *args, **kwargs) [ 971.685069] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise exception.InstanceNotFound(instance_id=uuid) [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] nova.exception.InstanceNotFound: Instance ad8223ea-b097-439f-bcff-9c06bd1cf5e6 could not be found. [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] During handling of the above exception, another exception occurred: [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last): [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs) [ 971.685470] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] exception_handler_v20(status_code, error_body) [ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise client_exc(message=error_message, [ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 971.686040] env[59490]: ERROR nova.compute.manager [instance: 
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6]
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] During handling of the above exception, another exception occurred:
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6]
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Traceback (most recent call last):
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._deallocate_network(context, instance, requested_networks)
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 971.686040] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self.network_api.deallocate_for_instance(
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] data = neutron.list_ports(**search_opts)
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs)
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.list('ports', self.ports_path, retrieve_all,
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs)
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] for r in self._pagination(collection, path, **params):
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] res = self.get(path, params=params)
[ 971.686543] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs)
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.retry_request("GET", action, body=body,
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs)
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] return self.do_request(method, action, body=body,
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] ret = obj(*args, **kwargs)
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] self._handle_fault_response(status_code, replybody, resp)
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 971.687050] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] raise exception.Unauthorized()
[ 971.687519] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] nova.exception.Unauthorized: Not authorized.
[ 971.687519] env[59490]: ERROR nova.compute.manager [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6]
[ 971.702923] env[59490]: DEBUG oslo_concurrency.lockutils [None req-eb8b3b71-c6b5-478f-8cda-47563b84b5d3 tempest-VolumesAdminNegativeTest-1456867310 tempest-VolumesAdminNegativeTest-1456867310-project-member] Lock "ad8223ea-b097-439f-bcff-9c06bd1cf5e6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 298.442s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 971.713787] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 971.757753] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 971.758010] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 971.759881] env[59490]: INFO nova.compute.claims [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 971.800019] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 971.800019] env[59490]: ERROR nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] result = getattr(controller, method)(*args, **kwargs)
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 971.800019] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._get(image_id)
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] resp, body = self.http_client.get(url, headers=header)
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self.request(url, 'GET', **kwargs)
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._handle_response(resp)
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise exc.from_response(resp, resp.content)
[ 971.800294] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] During handling of the above exception, another exception occurred:
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] yield resources
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self.driver.spawn(context, instance, image_meta,
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self._fetch_image_if_missing(context, vi)
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 971.800537] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image_fetch(context, vi, tmp_image_ds_loc)
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] images.fetch_image(
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] metadata = IMAGE_API.get(context, image_ref)
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return session.show(context, image_id,
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] _reraise_translated_image_exception(image_id)
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise new_exc.with_traceback(exc_trace)
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 971.800903] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] result = getattr(controller, method)(*args, **kwargs)
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._get(image_id)
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] resp, body = self.http_client.get(url, headers=header)
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self.request(url, 'GET', **kwargs)
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._handle_response(resp)
[ 971.801230] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 971.801500] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise exc.from_response(resp, resp.content)
[ 971.801500] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 971.801500] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 971.801500] env[59490]: INFO nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Terminating instance
[ 971.803471] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 971.803684] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 971.804326] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 971.804514] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 971.807254] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-78a0c81b-c8db-41d5-beb9-cc2bfda82a01 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.810582] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13d00f8d-95e5-45f0-8b8b-005b903c2650 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.819021] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 971.819583] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3feb7683-032a-41d2-9940-ded72b707217 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.821179] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 971.821458] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 971.822323] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f6e71fc-56eb-4cc3-99ca-7ef3149e1d0f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.827691] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Waiting for the task: (returnval){
[ 971.827691] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52a2f855-4b24-122a-39db-4c26dda8672c"
[ 971.827691] env[59490]: _type = "Task"
[ 971.827691] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 971.838361] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52a2f855-4b24-122a-39db-4c26dda8672c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 971.888128] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 971.888357] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 971.888544] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Deleting the datastore file [datastore2] 0ec55812-86b7-44ef-822a-88a2ff1816c3 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 971.889432] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-14f42ef9-1ffa-4e1d-a942-760e65b70e0e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.895162] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Waiting for the task: (returnval){
[ 971.895162] env[59490]: value = "task-707435"
[ 971.895162] env[59490]: _type = "Task"
[ 971.895162] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 971.903265] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Task: {'id': task-707435, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 971.981697] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f56cda78-d48d-4cdb-8e8d-6c99f7b282b2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 971.988612] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45bdcb9-5a77-4ef0-b820-c76c8e17d20f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.018895] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ad557be-f30a-48f8-abba-3e5be35c70f6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.025963] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5128f90d-120a-48d1-a846-f184a1d4ee3d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.038558] env[59490]: DEBUG nova.compute.provider_tree [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 972.047037] env[59490]: DEBUG nova.scheduler.client.report [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 972.062258] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.304s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 972.062791] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 972.095435] env[59490]: DEBUG nova.compute.utils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 972.097120] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 972.097227] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 972.109649] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 972.157900] env[59490]: DEBUG nova.policy [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93256992d7a84e72882b4c132c337393', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2133066748948909baea488349a4b78', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}}
[ 972.178525] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 972.200573] env[59490]: DEBUG nova.compute.manager [req-9c1a47f3-e7d5-4109-beb3-6d421f4069f4 req-9e3d76a2-4230-400f-9cf5-71558cbc325a service nova] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Received event network-vif-deleted-41409afd-dae8-473b-b3ba-424e7b48999b {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 972.201764] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 972.201764] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 972.201764] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 972.201898] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 972.201898] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 972.205020] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 972.205020] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 972.205020] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 972.205020] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 972.205020] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 972.205206] env[59490]: DEBUG nova.virt.hardware [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 972.205206] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-678e0ad0-f210-4547-9ae0-66c3720a4ae6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.213882] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e8a365c5-c29e-4b27-8fe2-db797f13f511 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.340622] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 972.340871] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Creating directory with path [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 972.341109] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-baeca1aa-6078-4259-9d15-87b542690c82 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.353637] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Created directory with path [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 972.353637] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Fetch image to [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 972.353637] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 972.354606] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1e572e6-d530-4207-bcd9-87278fac71c3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.361436] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3b8bf60b-47db-4e12-8a2c-8ea8ab58cf33 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.370660] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57d71c21-ae79-455f-ba2c-97f18a56e0f8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.406989] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0877e25-364a-406b-974d-8f53b588282f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.415031] env[59490]: DEBUG oslo_vmware.api [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Task: {'id': task-707435, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070949} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 972.416614] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 972.416917] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 972.416976] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 972.417134] env[59490]: INFO nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Took 0.61 seconds to destroy the instance on the hypervisor.
[ 972.419290] env[59490]: DEBUG nova.compute.claims [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 972.419413] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 972.419611] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 972.424092] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ac59f0af-a356-444b-91cc-4a268656cc9f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 972.444188] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 972.452256] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.032s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 972.453086] env[59490]: DEBUG nova.compute.utils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 972.454535] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 972.454700] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 972.454853] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 972.455036] env[59490]: DEBUG nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 972.455435] env[59490]: DEBUG nova.network.neutron [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 972.499808] env[59490]: DEBUG oslo_vmware.rw_handles [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 972.552388] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Successfully created port: 3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 972.558096] env[59490]: DEBUG oslo_vmware.rw_handles [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 972.558301] env[59490]: DEBUG oslo_vmware.rw_handles [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 972.624897] env[59490]: DEBUG neutronclient.v2_0.client [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 972.626777] env[59490]: ERROR nova.compute.manager [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] result = getattr(controller, method)(*args, **kwargs)
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._get(image_id)
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 972.626777] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] resp, body = self.http_client.get(url, headers=header)
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self.request(url, 'GET', **kwargs)
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._handle_response(resp)
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise exc.from_response(resp, resp.content)
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] During handling of the above exception, another exception occurred:
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.627176] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self.driver.spawn(context, instance, image_meta,
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] self._fetch_image_if_missing(context, vi)
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image_fetch(context, vi, tmp_image_ds_loc)
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] images.fetch_image(
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] metadata = IMAGE_API.get(context, image_ref)
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 972.627501] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return session.show(context, image_id,
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] _reraise_translated_image_exception(image_id)
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise new_exc.with_traceback(exc_trace)
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] result = getattr(controller, method)(*args, **kwargs)
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._get(image_id)
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return RequestIdProxy(wrapped(*args, **kwargs))
[ 972.627845] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] resp, body = self.http_client.get(url, headers=header)
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self.request(url, 'GET', **kwargs)
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] return self._handle_response(resp)
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] raise exc.from_response(resp, resp.content)
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] During handling of the above exception, another exception occurred:
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.628184] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self._build_and_run_instance(context, instance, image,
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     with excutils.save_and_reraise_exception():
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self.force_reraise()
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise self.value
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     with self.rt.instance_claim(context, instance, node, allocs,
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self.abort()
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 972.628531] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return f(*args, **kwargs)
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self._unset_instance_host_and_node(instance)
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     instance.save()
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     updates, result = self.indirection_api.object_action(
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return cctxt.call(context, 'object_action', objinst=objinst,
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 972.628915] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     result = self.transport._send(
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return self._driver.send(target, ctxt, message,
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise result
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] nova.exception_Remote.InstanceNotFound_Remote: Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 could not be found.
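These frames explain why a second, unrelated failure surfaces here: once ImageNotAuthorized propagates, _build_and_run_instance runs cleanup under oslo.utils' save_and_reraise_exception(), and aborting the resource claim issues a conductor RPC that itself fails because the instance row is already gone; the remote traceback that follows is the conductor-side detail of that InstanceNotFound_Remote. Below is a simplified, self-contained sketch of the context-manager pattern, not oslo's actual implementation; the helper functions in the usage are illustrative.

```python
# Simplified sketch of the oslo_utils.excutils.save_and_reraise_exception
# pattern visible in the frames above.
import sys


class save_and_reraise_exception:
    """Capture the in-flight exception, let the with-block run cleanup,
    then re-raise the captured exception from __exit__."""

    def __enter__(self):
        # Must be entered while an exception is being handled.
        self.type_, self.value, self.tb = sys.exc_info()
        return self

    def __exit__(self, exc_type, exc_value, exc_tb):
        if exc_type is not None:
            # The cleanup itself raised (here: InstanceNotFound from the
            # conductor while aborting the claim). Returning False lets
            # the new exception propagate; Python chains it onto the
            # original, producing the stacked "During handling of the
            # above exception" blocks seen in this log.
            return False
        # Cleanup succeeded: re-raise the original exception.
        self.force_reraise()

    def force_reraise(self):
        raise self.value.with_traceback(self.tb)


def _build_and_run_instance(spawn, abort_claim):
    # Usage shaped like the compute-manager frames above
    # (spawn/abort_claim are illustrative callables).
    try:
        spawn()
    except Exception:
        with save_and_reraise_exception():
            abort_claim()
```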
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return getattr(target, method)(*args, **kwargs)
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629239] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return fn(self, *args, **kwargs)
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     old_ref, inst_ref = db.instance_update_and_get_original(
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return f(*args, **kwargs)
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     with excutils.save_and_reraise_exception() as ectxt:
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self.force_reraise()
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629575] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise self.value
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return f(*args, **kwargs)
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return f(context, *args, **kwargs)
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise exception.InstanceNotFound(instance_id=uuid)
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.629952] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] nova.exception.InstanceNotFound: Instance 0ec55812-86b7-44ef-822a-88a2ff1816c3 could not be found.
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] During handling of the above exception, another exception occurred:
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     exception_handler_v20(status_code, error_body)
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise client_exc(message=error_message,
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Neutron server returns request_ids: ['req-7ed75fa0-d444-4f82-be7f-ed0a0d057bd5']
[ 972.630827] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] During handling of the above exception, another exception occurred:
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Traceback (most recent call last):
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self._deallocate_network(context, instance, requested_networks)
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self.network_api.deallocate_for_instance(
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     data = neutron.list_ports(**search_opts)
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 972.631222] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return self.list('ports', self.ports_path, retrieve_all,
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     for r in self._pagination(collection, path, **params):
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     res = self.get(path, params=params)
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return self.retry_request("GET", action, body=body,
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.631557] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     return self.do_request(method, action, body=body,
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     ret = obj(*args, **kwargs)
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     self._handle_fault_response(status_code, replybody, resp)
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]   File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]     raise exception.Unauthorized()
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] nova.exception.Unauthorized: Not authorized.
[ 972.631887] env[59490]: ERROR nova.compute.manager [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3]
[ 972.657769] env[59490]: DEBUG oslo_concurrency.lockutils [None req-4863f235-3544-45e7-920e-a35f3fa728f4 tempest-ServerMetadataTestJSON-1508552220 tempest-ServerMetadataTestJSON-1508552220-project-member] Lock "0ec55812-86b7-44ef-822a-88a2ff1816c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 302.791s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 972.671059] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 972.718360] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 972.718600] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 972.720075] env[59490]: INFO nova.compute.claims [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 972.941920] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c921d387-6d02-4b79-bf83-9dcca4d883aa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 972.949391] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54c5bb26-75ee-42d6-94cb-84b960a7ff24 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 972.980742] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-797b4385-b48c-4911-8e98-ef1035a88763 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 972.987969] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a3b25ba-a38c-4cab-9f83-9ffc92db1f6b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.000887] env[59490]: DEBUG nova.compute.provider_tree [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 973.009409] env[59490]: DEBUG nova.scheduler.client.report [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 973.021981] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 
tempest-ServersTestMultiNic-180611203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.303s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.022462] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 973.059382] env[59490]: DEBUG nova.compute.utils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 973.061053] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 973.061270] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 973.072508] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 973.137429] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 973.150498] env[59490]: DEBUG nova.policy [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8149f901ee5c42d488e9010862c2f003', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9c82d4ee7f154795b0d110a31b975096', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 973.161487] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 973.161709] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 973.161855] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 973.162048] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 973.162196] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 973.162333] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 973.162530] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 
tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 973.162682] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 973.162841] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 973.162992] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 973.163171] env[59490]: DEBUG nova.virt.hardware [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 973.164023] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f31619b6-19b9-4b30-9738-43030788708d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.171730] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-850217d3-1575-42f6-b493-d5da980cb24a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.186715] env[59490]: DEBUG nova.compute.manager [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Received event network-vif-plugged-3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 973.186927] env[59490]: DEBUG oslo_concurrency.lockutils [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] Acquiring lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 973.187119] env[59490]: DEBUG oslo_concurrency.lockutils [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 973.187272] env[59490]: DEBUG oslo_concurrency.lockutils [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" "released" by 
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 973.187450] env[59490]: DEBUG nova.compute.manager [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] No waiting events found dispatching network-vif-plugged-3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 973.187627] env[59490]: WARNING nova.compute.manager [req-3bcbac69-49c6-49d0-ae74-083c7147a562 req-96de7c49-90c6-4b84-a488-6ab19b0df70b service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Received unexpected event network-vif-plugged-3e01dba7-2f19-4e79-b0f7-cea68eeb8065 for instance with vm_state building and task_state spawning. [ 973.362285] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Successfully updated port: 3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 973.378665] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 973.378811] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 973.378958] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 973.444336] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 973.659906] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Successfully created port: 302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 973.719149] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Updating instance_info_cache with network_info: [{"id": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "address": "fa:16:3e:b7:26:3c", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e01dba7-2f", "ovs_interfaceid": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 973.731019] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 973.731113] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance network_info: |[{"id": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "address": "fa:16:3e:b7:26:3c", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", 
"segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e01dba7-2f", "ovs_interfaceid": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 973.731853] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b7:26:3c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3ac3fd84-c373-49f5-82dc-784a6cdb686d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3e01dba7-2f19-4e79-b0f7-cea68eeb8065', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 973.744033] env[59490]: DEBUG oslo.service.loopingcall [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 973.744647] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 973.744943] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6b1c6aeb-a61e-4173-8676-37b02f68cf41 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 973.769287] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 973.769287] env[59490]: value = "task-707436" [ 973.769287] env[59490]: _type = "Task" [ 973.769287] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 973.777063] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707436, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 974.155331] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Successfully created port: 622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 974.280458] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707436, 'name': CreateVM_Task, 'duration_secs': 0.294026} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 974.280636] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 974.281315] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 974.281470] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 974.281898] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 974.282154] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5299693d-63d1-4c08-9e2f-b0ccaa220a6a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 974.287745] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 974.287745] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]524536f1-02cf-7f6f-0380-193b080b0cea" [ 974.287745] env[59490]: _type = "Task" [ 974.287745] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 974.296621] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]524536f1-02cf-7f6f-0380-193b080b0cea, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 974.784495] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Successfully updated port: 302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 974.799854] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 974.800109] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 974.800311] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 975.213664] env[59490]: DEBUG nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Received event network-changed-3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 975.213850] env[59490]: DEBUG nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Refreshing instance network info cache due to event network-changed-3e01dba7-2f19-4e79-b0f7-cea68eeb8065. 
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 975.214228] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Acquiring lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 975.214426] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Acquired lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 975.214595] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Refreshing network info cache for port 3e01dba7-2f19-4e79-b0f7-cea68eeb8065 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 975.486355] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Successfully updated port: 622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 975.498419] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 975.498567] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquired lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 975.498792] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 975.575550] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 975.748321] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Updated VIF entry in instance network info cache for port 3e01dba7-2f19-4e79-b0f7-cea68eeb8065. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 975.748666] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Updating instance_info_cache with network_info: [{"id": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "address": "fa:16:3e:b7:26:3c", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3e01dba7-2f", "ovs_interfaceid": "3e01dba7-2f19-4e79-b0f7-cea68eeb8065", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 975.757554] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Releasing lock "refresh_cache-2907e146-ad50-47f3-9390-7ae3ae99ce97" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 975.757811] env[59490]: DEBUG nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-vif-plugged-302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 975.757990] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Acquiring lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 975.758198] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 975.758349] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 975.758504] env[59490]: DEBUG nova.compute.manager 
[req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] No waiting events found dispatching network-vif-plugged-302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 975.758663] env[59490]: WARNING nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received unexpected event network-vif-plugged-302b78eb-2b54-407c-b685-09c8f1da1100 for instance with vm_state building and task_state spawning. [ 975.758814] env[59490]: DEBUG nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-changed-302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 975.758956] env[59490]: DEBUG nova.compute.manager [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Refreshing instance network info cache due to event network-changed-302b78eb-2b54-407c-b685-09c8f1da1100. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 975.759123] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Acquiring lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 975.969419] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updating instance_info_cache with network_info: [{"id": "302b78eb-2b54-407c-b685-09c8f1da1100", "address": "fa:16:3e:92:a4:2a", "network": {"id": "0a2b28eb-fd9a-4e96-b1ab-a1422e9505de", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-999710046", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.67", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "503991c4-44d0-42d9-aa03-5259331f1051", "external-id": "nsx-vlan-transportzone-3", "segmentation_id": 3, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap302b78eb-2b", "ovs_interfaceid": "302b78eb-2b54-407c-b685-09c8f1da1100", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "622c9619-1870-4434-aee0-8d5ab7122977", "address": "fa:16:3e:c4:f6:e9", "network": {"id": "c09dc8c1-02e0-47ec-9568-0588a56920b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2139108784", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"192.168.129.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap622c9619-18", "ovs_interfaceid": "622c9619-1870-4434-aee0-8d5ab7122977", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 975.980326] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Releasing lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 975.980617] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance network_info: |[{"id": "302b78eb-2b54-407c-b685-09c8f1da1100", "address": "fa:16:3e:92:a4:2a", "network": {"id": "0a2b28eb-fd9a-4e96-b1ab-a1422e9505de", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-999710046", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.67", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "503991c4-44d0-42d9-aa03-5259331f1051", "external-id": "nsx-vlan-transportzone-3", "segmentation_id": 3, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap302b78eb-2b", "ovs_interfaceid": "302b78eb-2b54-407c-b685-09c8f1da1100", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "622c9619-1870-4434-aee0-8d5ab7122977", "address": "fa:16:3e:c4:f6:e9", "network": {"id": "c09dc8c1-02e0-47ec-9568-0588a56920b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2139108784", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": 
"nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap622c9619-18", "ovs_interfaceid": "622c9619-1870-4434-aee0-8d5ab7122977", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 975.982646] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Acquired lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 975.982646] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Refreshing network info cache for port 302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 975.982646] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:92:a4:2a', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '503991c4-44d0-42d9-aa03-5259331f1051', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '302b78eb-2b54-407c-b685-09c8f1da1100', 'vif_model': 'vmxnet3'}, {'network_name': 'br-int', 'mac_address': 'fa:16:3e:c4:f6:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '13e83154-c0d2-4d3d-b95e-3cd5ba336257', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '622c9619-1870-4434-aee0-8d5ab7122977', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 975.992488] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Creating folder: Project (9c82d4ee7f154795b0d110a31b975096). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 975.993369] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0662993f-31c4-43a9-a76f-ee52045d7bf9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.005862] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Created folder: Project (9c82d4ee7f154795b0d110a31b975096) in parent group-v168905. [ 976.006048] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Creating folder: Instances. Parent ref: group-v168961. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 976.006262] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d08371f-a6fc-47b9-80d5-b137aa7b1856 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.014441] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Created folder: Instances in parent group-v168961. [ 976.014641] env[59490]: DEBUG oslo.service.loopingcall [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 976.014799] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 976.014973] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-04ba7591-f059-4a3a-ab82-dbbef9e10a4c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.038071] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 976.038071] env[59490]: value = "task-707439" [ 976.038071] env[59490]: _type = "Task" [ 976.038071] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.046518] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707439, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 976.376482] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updated VIF entry in instance network info cache for port 302b78eb-2b54-407c-b685-09c8f1da1100. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 976.376844] env[59490]: DEBUG nova.network.neutron [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updating instance_info_cache with network_info: [{"id": "302b78eb-2b54-407c-b685-09c8f1da1100", "address": "fa:16:3e:92:a4:2a", "network": {"id": "0a2b28eb-fd9a-4e96-b1ab-a1422e9505de", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-999710046", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.67", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "503991c4-44d0-42d9-aa03-5259331f1051", "external-id": "nsx-vlan-transportzone-3", "segmentation_id": 3, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap302b78eb-2b", "ovs_interfaceid": "302b78eb-2b54-407c-b685-09c8f1da1100", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "622c9619-1870-4434-aee0-8d5ab7122977", "address": "fa:16:3e:c4:f6:e9", "network": {"id": "c09dc8c1-02e0-47ec-9568-0588a56920b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2139108784", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap622c9619-18", "ovs_interfaceid": "622c9619-1870-4434-aee0-8d5ab7122977", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 976.386874] env[59490]: DEBUG oslo_concurrency.lockutils [req-d20a4a74-0f1d-4e18-a3eb-732ed6936819 req-db1c2ecd-5f16-44ef-8e30-d1d12bc66312 service nova] Releasing lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 976.548112] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707439, 'name': CreateVM_Task, 'duration_secs': 0.323823} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 976.548274] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 976.549017] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 976.549177] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 976.549483] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 976.549722] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a8797fcf-5e6d-40e4-8d4b-71df8698f3f1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 976.553994] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Waiting for the task: (returnval){ [ 976.553994] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52267392-1bfb-84dd-3e0d-b68eec94ecb6" [ 976.553994] env[59490]: _type = "Task" [ 976.553994] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 976.561221] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52267392-1bfb-84dd-3e0d-b68eec94ecb6, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 977.064645] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 977.065032] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 977.065148] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 977.238940] env[59490]: DEBUG nova.compute.manager [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-vif-plugged-622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 977.239180] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Acquiring lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 977.239442] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 977.239529] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 977.239687] env[59490]: DEBUG nova.compute.manager [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] No waiting events found dispatching network-vif-plugged-622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 977.239845] env[59490]: WARNING nova.compute.manager [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received unexpected event network-vif-plugged-622c9619-1870-4434-aee0-8d5ab7122977 for instance with vm_state building and task_state spawning. 
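
The req-601d4c81 records above trace Nova's external-event path end to end: Neutron reports network-vif-plugged for port 622c9619-1870-4434-aee0-8d5ab7122977, the per-instance "-events" lock serializes the waiter lookup, no registered waiter is found, and the "Received unexpected event" warning follows because the instance is still in vm_state building. The sketch below illustrates only that lock-serialized pop pattern, using the same oslo.concurrency primitive the log shows; the dict store and function shape are assumptions for illustration, not Nova's actual InstanceEvents implementation in nova/compute/manager.py.

    from oslo_concurrency import lockutils

    # Hypothetical in-process store, not Nova's actual data structure:
    # {instance_uuid: {event_name: waiter}}
    _events = {}

    def pop_instance_event(instance_uuid, event_name):
        # lockutils.lock() is the primitive behind the "<uuid>-events"
        # Acquiring/acquired/released DEBUG lines in the log above.
        with lockutils.lock(instance_uuid + '-events'):
            # A None result here corresponds to the "No waiting events
            # found" DEBUG and the "Received unexpected event" WARNING,
            # which is expected while the instance is still spawning.
            return _events.get(instance_uuid, {}).pop(event_name, None)
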
[ 977.239991] env[59490]: DEBUG nova.compute.manager [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-changed-622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 977.240337] env[59490]: DEBUG nova.compute.manager [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Refreshing instance network info cache due to event network-changed-622c9619-1870-4434-aee0-8d5ab7122977. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 977.240518] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Acquiring lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 977.240648] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Acquired lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 977.240794] env[59490]: DEBUG nova.network.neutron [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Refreshing network info cache for port 622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 977.517705] env[59490]: DEBUG nova.network.neutron [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updated VIF entry in instance network info cache for port 622c9619-1870-4434-aee0-8d5ab7122977. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 977.518108] env[59490]: DEBUG nova.network.neutron [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updating instance_info_cache with network_info: [{"id": "302b78eb-2b54-407c-b685-09c8f1da1100", "address": "fa:16:3e:92:a4:2a", "network": {"id": "0a2b28eb-fd9a-4e96-b1ab-a1422e9505de", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-999710046", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.67", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "503991c4-44d0-42d9-aa03-5259331f1051", "external-id": "nsx-vlan-transportzone-3", "segmentation_id": 3, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap302b78eb-2b", "ovs_interfaceid": "302b78eb-2b54-407c-b685-09c8f1da1100", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "622c9619-1870-4434-aee0-8d5ab7122977", "address": "fa:16:3e:c4:f6:e9", "network": {"id": "c09dc8c1-02e0-47ec-9568-0588a56920b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2139108784", "subnets": [{"cidr": "192.168.129.0/24", "dns": [], "gateway": {"address": "192.168.129.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.129.214", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.129.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "13e83154-c0d2-4d3d-b95e-3cd5ba336257", "external-id": "nsx-vlan-transportzone-771", "segmentation_id": 771, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap622c9619-18", "ovs_interfaceid": "622c9619-1870-4434-aee0-8d5ab7122977", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 977.527148] env[59490]: DEBUG oslo_concurrency.lockutils [req-601d4c81-a7cf-4dbe-a837-38abc3e7a598 req-77d175d1-b9b6-46b6-94e6-95fb15feb1f8 service nova] Releasing lock "refresh_cache-f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1006.383996] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1006.384374] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1007.384694] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1008.384442] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1008.397793] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1008.398059] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1008.398153] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1008.398298] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1008.399361] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46c4c399-5058-4125-9811-57b92ea5028e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.408018] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f22945a-7466-4f13-9fd4-8c8d772cb388 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.423892] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfb0bd12-f2e8-45d9-b1b0-4cb103f5ca1d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.433180] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de96e5d6-e4d1-4625-828f-e12e64c5b407 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.464469] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181656MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1008.464632] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1008.464814] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1008.508188] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1008.508360] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1008.519642] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ddbac2db-c555-4554-aa21-7303c8e36371 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.530221] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.540699] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e879cc90-f290-42cd-9059-46f42284a32c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.550714] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.560915] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f4bbfad2-f118-4292-bb36-4229c333dd4c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.570708] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 014bca6d-9df7-4245-90b4-3f291262292a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.580617] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d0673be9-d670-4d3f-aefa-26f4e336a695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.590299] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ecb7312c-80f0-490e-8357-7138680d0f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.600601] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e24d5bbc-6168-4523-9a0c-cd29c14c9e56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.610620] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 643bfd74-592a-452c-af62-ded4c23009f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1008.610826] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 2 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1008.610966] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=768MB phys_disk=200GB used_disk=2GB total_vcpus=48 used_vcpus=2 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1008.758307] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-501d492e-23be-46cc-ae38-3e5385c4c177 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.765906] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a087367e-1e20-4ec4-b1be-b855712a48a6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.796077] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b561e3c-76e7-481f-b693-44dda24508d7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.803060] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-441e5777-4316-4e2a-b9b1-2110644f7fa5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1008.815547] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1008.823264] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1008.835436] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1008.835605] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.371s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1010.829876] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1010.830170] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1011.384570] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1011.384748] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1011.384868] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 1011.396469] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1011.396659] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1011.396836] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 1011.397302] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1012.383943] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1012.396077] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1016.383562] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1019.485732] env[59490]: WARNING oslo_vmware.rw_handles [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1019.485732] env[59490]: ERROR oslo_vmware.rw_handles [ 1019.486580] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1019.487985] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Caching image 
{{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1019.488237] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Copying Virtual Disk [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/f482a144-43a2-49a7-ad5d-6326a230ac4e/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1019.488507] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-3e9bed91-225e-4c79-902f-a4345c33ac0e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1019.495929] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Waiting for the task: (returnval){ [ 1019.495929] env[59490]: value = "task-707440" [ 1019.495929] env[59490]: _type = "Task" [ 1019.495929] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1019.504035] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Task: {'id': task-707440, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1019.883900] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1020.007353] env[59490]: DEBUG oslo_vmware.exceptions [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Fault InvalidArgument not matched. 
{{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1020.007610] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1020.008145] env[59490]: ERROR nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1020.008145] env[59490]: Faults: ['InvalidArgument'] [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] yield resources [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.driver.spawn(context, instance, image_meta, [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._fetch_image_if_missing(context, vi) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] image_cache(vi, tmp_image_ds_loc) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] vm_util.copy_virtual_disk( [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] session._wait_for_task(vmdk_copy_task) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 
157, in _wait_for_task [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.wait_for_task(task_ref) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return evt.wait() [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] result = hub.switch() [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.greenlet.switch() [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.f(*self.args, **self.kw) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise exceptions.translate_fault(task_info.error) [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Faults: ['InvalidArgument'] [ 1020.008145] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1020.009082] env[59490]: INFO nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Terminating instance [ 1020.011007] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1020.011210] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1020.011467] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1020.011652] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1020.012340] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f10f9ee5-33f8-41eb-a9d9-004fa0e6e369 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.014815] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f7a70e1-6951-4477-82aa-9f60b7270437 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.632322] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1020.633494] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8e7c46ef-6db0-48d5-91e1-ba3293f2792b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.634991] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1020.635169] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1020.635814] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7c1bac02-8bd2-45d1-98ba-e85ffe4e36c5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.641229] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Waiting for the task: (returnval){ [ 1020.641229] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52a98d1f-d012-1939-f0ed-a5b673673764" [ 1020.641229] env[59490]: _type = "Task" [ 1020.641229] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1020.648222] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52a98d1f-d012-1939-f0ed-a5b673673764, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1020.703589] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1020.703792] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1020.703977] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Deleting the datastore file [datastore2] 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1020.704366] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-49120b53-ad71-41bb-ad91-702ac5a78ec1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1020.711095] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Waiting for the task: (returnval){ [ 1020.711095] env[59490]: value = "task-707442" [ 1020.711095] env[59490]: _type = "Task" [ 1020.711095] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1020.719101] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Task: {'id': task-707442, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1021.151383] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1021.151615] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Creating directory with path [datastore2] vmware_temp/91b1c258-e703-479a-a440-e6356b1deb61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1021.151833] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1d423757-4bc3-4d80-8e26-85aec9d1144f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.162491] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Created directory with path [datastore2] vmware_temp/91b1c258-e703-479a-a440-e6356b1deb61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1021.162665] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Fetch image to [datastore2] vmware_temp/91b1c258-e703-479a-a440-e6356b1deb61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1021.162820] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/91b1c258-e703-479a-a440-e6356b1deb61/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1021.163500] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d4e8ec4-6053-4137-b11d-bf80904e4dae {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.169855] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c191f5-b185-4b01-b664-2d14ded8b56f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.178529] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-171e5ce8-24c3-4bda-9974-ac6b29ce6f81 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.208014] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5c82c800-3d08-4e94-9819-14a39e3e1066 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.216570] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8cf73e8b-17d6-4c3d-933f-cecf5b041f9a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.220671] env[59490]: DEBUG oslo_vmware.api [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Task: {'id': task-707442, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065723} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1021.221260] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1021.221432] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1021.221616] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1021.221776] env[59490]: INFO nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Took 1.21 seconds to destroy the instance on the hypervisor. 
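The "Invoking FileManager.DeleteDatastoreFile_Task", "Waiting for the task", and "completed successfully" records above all follow oslo.vmware's invoke-then-poll pattern: an asynchronous vSphere method is invoked through the API session, immediately returns a Task managed-object reference, and wait_for_task() polls its TaskInfo (the "progress is 0%" lines) until the task succeeds or raises a translated fault. A minimal sketch of that pattern, assuming placeholder vCenter credentials and datastore paths (none of these values are taken from this log):

    # Illustrative sketch only; the host, credentials, and datastore path
    # below are placeholders, not values from this deployment.
    from oslo_vmware import api

    session = api.VMwareAPISession(
        'vcenter.example.com', 'user', 'secret',
        api_retry_count=3,
        task_poll_interval=0.5)  # seconds between the "progress is ..." polls

    file_manager = session.vim.service_content.fileManager
    # Asynchronous vSphere methods return a Task reference immediately.
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] <instance-uuid>',  # placeholder datastore path
        datacenter=None)                      # a real caller passes the DC ref

    # wait_for_task() polls the TaskInfo until it reports success; if the
    # task ends in an error state it raises a translated exception such as
    # VimFaultException -- the same path that surfaces the InvalidArgument
    # fault ("A specified parameter was not correct: fileType") further down.
    session.wait_for_task(task)

The per-request opID values (opID=oslo.vmware-...) seen throughout these records are attached by oslo.vmware's request handler so that each SOAP call can be correlated on the vCenter side.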
[ 1021.223738] env[59490]: DEBUG nova.compute.claims [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1021.223886] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.224100] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.240703] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1021.250578] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.251230] env[59490]: DEBUG nova.compute.utils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1021.252750] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1021.253082] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1021.253082] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1021.253244] env[59490]: DEBUG nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1021.253379] env[59490]: DEBUG nova.network.neutron [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1021.289242] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1021.289955] env[59490]: ERROR nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] result = getattr(controller, method)(*args, **kwargs) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._get(image_id) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] resp, body = self.http_client.get(url, headers=header) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 
31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.request(url, 'GET', **kwargs) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._handle_response(resp) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exc.from_response(resp, resp.content) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] During handling of the above exception, another exception occurred: [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] yield resources [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.driver.spawn(context, instance, image_meta, [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._fetch_image_if_missing(context, vi) [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image_fetch(context, vi, tmp_image_ds_loc) [ 1021.289955] env[59490]: ERROR 
nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] images.fetch_image( [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1021.289955] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] metadata = IMAGE_API.get(context, image_ref) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return session.show(context, image_id, [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] _reraise_translated_image_exception(image_id) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise new_exc.with_traceback(exc_trace) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] result = getattr(controller, method)(*args, **kwargs) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._get(image_id) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] resp, body = self.http_client.get(url, headers=header) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] 
return self.request(url, 'GET', **kwargs) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._handle_response(resp) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exc.from_response(resp, resp.content) [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1021.291590] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.291590] env[59490]: INFO nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Terminating instance [ 1021.292290] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1021.292290] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1021.292439] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1021.292614] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1021.292828] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-524c4482-ca60-4fd8-9849-a99e92e2fe65 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.295245] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e220aa0-51ef-4ea3-ac48-4787cf35ecc4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.302376] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1021.302571] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0fca4962-09f4-4e52-9c14-d236f7ff272e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.304825] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1021.304914] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1021.305818] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5154a0aa-f3ce-4d36-84e8-8b6175ce34fd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.310367] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Waiting for the task: (returnval){ [ 1021.310367] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]529c805a-3b87-22a1-355b-f3700278942b" [ 1021.310367] env[59490]: _type = "Task" [ 1021.310367] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1021.318860] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]529c805a-3b87-22a1-355b-f3700278942b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1021.362542] env[59490]: DEBUG neutronclient.v2_0.client [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1021.363992] env[59490]: ERROR nova.compute.manager [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.driver.spawn(context, instance, image_meta, [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._fetch_image_if_missing(context, vi) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] image_cache(vi, tmp_image_ds_loc) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] vm_util.copy_virtual_disk( [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] session._wait_for_task(vmdk_copy_task) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.wait_for_task(task_ref) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1021.363992] env[59490]: ERROR 
nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return evt.wait() [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] result = hub.switch() [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.greenlet.switch() [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.f(*self.args, **self.kw) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise exceptions.translate_fault(task_info.error) [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Faults: ['InvalidArgument'] [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] During handling of the above exception, another exception occurred: [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._build_and_run_instance(context, instance, image, [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] with excutils.save_and_reraise_exception(): [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.force_reraise() [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.363992] env[59490]: ERROR nova.compute.manager [instance: 
67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise self.value [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] with self.rt.instance_claim(context, instance, node, allocs, [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.abort() [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return f(*args, **kwargs) [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._unset_instance_host_and_node(instance) [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] instance.save() [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] updates, result = self.indirection_api.object_action( [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return cctxt.call(context, 'object_action', objinst=objinst, [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] result = self.transport._send( [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self._driver.send(target, ctxt, message, [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise result [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] nova.exception_Remote.InstanceNotFound_Remote: Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 could not be found. [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return getattr(target, method)(*args, **kwargs) [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return fn(self, *args, **kwargs) [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] old_ref, inst_ref = db.instance_update_and_get_original( [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return f(*args, **kwargs) [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1021.365587] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] with excutils.save_and_reraise_exception() as ectxt: [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.force_reraise() [ 1021.367807] env[59490]: ERROR 
nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise self.value [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return f(*args, **kwargs) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return f(context, *args, **kwargs) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise exception.InstanceNotFound(instance_id=uuid) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] nova.exception.InstanceNotFound: Instance 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7 could not be found. 
[ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] During handling of the above exception, another exception occurred: [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] exception_handler_v20(status_code, error_body) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise client_exc(message=error_message, [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Neutron server returns request_ids: ['req-fa7829e0-83f6-43a3-b13b-2d3982555ed9'] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] During handling of the above exception, another exception occurred: [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Traceback (most recent call last): [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._deallocate_network(context, instance, requested_networks) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self.network_api.deallocate_for_instance( [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 
67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] data = neutron.list_ports(**search_opts) [ 1021.367807] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.list('ports', self.ports_path, retrieve_all, [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] for r in self._pagination(collection, path, **params): [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] res = self.get(path, params=params) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.retry_request("GET", action, body=body, [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] return self.do_request(method, action, body=body, [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] ret = obj(*args, **kwargs) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, 
in do_request [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] self._handle_fault_response(status_code, replybody, resp) [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] raise exception.Unauthorized() [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] nova.exception.Unauthorized: Not authorized. [ 1021.369214] env[59490]: ERROR nova.compute.manager [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] [ 1021.373992] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1021.374217] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1021.374419] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Deleting the datastore file [datastore2] 31207de9-e903-4ed4-bccc-c0796edec34b {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1021.375212] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-81b8fe29-56d6-45c0-a130-b9ae5b33de40 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.384172] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Waiting for the task: (returnval){ [ 1021.384172] env[59490]: value = "task-707444" [ 1021.384172] env[59490]: _type = "Task" [ 1021.384172] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1021.388561] env[59490]: DEBUG oslo_concurrency.lockutils [None req-f4bed56e-a549-4a2b-9134-ad5ec5fdabe8 tempest-ServerDiskConfigTestJSON-124999452 tempest-ServerDiskConfigTestJSON-124999452-project-member] Lock "67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 344.449s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.394587] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Task: {'id': task-707444, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1021.398017] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1021.443511] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.443764] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.445280] env[59490]: INFO nova.compute.claims [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1021.623354] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d2ae158-7e4e-4633-b855-ebb8f7601867 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.630501] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92ecbb9f-2660-495e-87d3-de7ac7f14aa7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.659764] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f35bdf73-7143-4107-ae7a-7a9d878bfb1b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.666222] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-184a5ce7-9e43-4fe1-8936-39ed25faf1c0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.678503] env[59490]: DEBUG nova.compute.provider_tree [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1021.687174] env[59490]: DEBUG nova.scheduler.client.report [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 
1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1021.702248] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.258s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.702844] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1021.733613] env[59490]: DEBUG nova.compute.utils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1021.735011] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1021.735565] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1021.744025] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1021.795051] env[59490]: DEBUG nova.policy [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '32fffc7664814bdba81ed340d27e444c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5530a0bb6d434878aed7b9c96009b416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1021.808180] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1021.823331] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1021.823585] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Creating directory with path [datastore2] vmware_temp/0ef3fc49-e718-4bc8-a709-8dfb4c4b0668/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1021.823975] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-83f864f7-c745-449b-9a78-92def2263f81 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.830441] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1021.830702] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1021.830908] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1021.831199] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1021.831323] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1021.831521] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 
tempest-MultipleCreateTestJSON-773842354-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1021.831768] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1021.831947] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1021.832178] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1021.832393] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1021.832609] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1021.833456] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93452789-ca24-409b-a1de-44468b6c93e0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.836845] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Created directory with path [datastore2] vmware_temp/0ef3fc49-e718-4bc8-a709-8dfb4c4b0668/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1021.837021] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Fetch image to [datastore2] vmware_temp/0ef3fc49-e718-4bc8-a709-8dfb4c4b0668/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1021.837184] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] 
vmware_temp/0ef3fc49-e718-4bc8-a709-8dfb4c4b0668/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1021.838280] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-234797aa-4a55-461f-8c73-419c7aa9b96f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.844074] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4206207b-7597-4f6a-963b-92dd6db050c1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.850413] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f4a245-59f7-4aa4-8bef-e876305ffe5e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.866217] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed51a5c8-b13a-4bc6-90a4-890c801c8ad3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.901828] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50f0a981-2138-4181-9db8-b209191c0d7d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.908726] env[59490]: DEBUG oslo_vmware.api [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Task: {'id': task-707444, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063859} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1021.910995] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1021.910995] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1021.910995] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1021.910995] env[59490]: INFO nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Took 0.62 seconds to destroy the instance on the hypervisor. 
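Annotation: the DeleteDatastoreFile_Task / wait_for_task exchange above is oslo.vmware's standard invoke-and-poll pattern, the same one behind every "_poll_task" record in this log. A minimal sketch of that pattern follows; the vCenter address, credentials, and datastore path are placeholders, not values from this log:

    from oslo_vmware import api

    # Placeholder connection details; a real deployment reads these
    # from nova.conf ([vmware] section).
    session = api.VMwareAPISession(
        'vcenter.example.org', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    # FileManager.DeleteDatastoreFile_Task returns a Task managed
    # object reference; wait_for_task() polls it (the _poll_task calls
    # seen in this log) until it succeeds, raising on task error.
    file_manager = session.vim.service_content.fileManager
    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task', file_manager,
        name='[datastore2] vmware_temp/example-dir',
        datacenter=None)  # placeholder; a Datacenter moref in real code
    session.wait_for_task(task)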
[ 1021.913884] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3f6a2511-70e2-4d22-a48a-7c50bcefe537 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1021.914421] env[59490]: DEBUG nova.compute.claims [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1021.914466] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1021.914649] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1021.935594] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1021.942410] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1021.943090] env[59490]: DEBUG nova.compute.utils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance 31207de9-e903-4ed4-bccc-c0796edec34b could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1021.944492] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance disappeared during build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1021.944667] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1021.944821] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1021.944980] env[59490]: DEBUG nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1021.945189] env[59490]: DEBUG nova.network.neutron [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1021.986927] env[59490]: DEBUG neutronclient.v2_0.client [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1021.988673] env[59490]: ERROR nova.compute.manager [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
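Annotation: this Neutron 401 and the Glance 401 in the traceback that follows are consistent with the request's user token expiring during the long build (the lock for this instance was held 342 seconds, per the lock-release record further down). The usual mitigation is Nova's [service_user] send_service_user_token option, which attaches a valid service token next to the possibly expired user token. A sketch of that wrapping, using keystoneauth1's ServiceTokenAuthWrapper as Nova does; the Keystone URL and credentials are placeholders:

    from keystoneauth1.identity import v3
    from keystoneauth1.service_token import ServiceTokenAuthWrapper

    # Placeholder service-user credentials (what nova.conf's
    # [service_user] section would configure).
    service_auth = v3.Password(
        auth_url='http://keystone.example.org/identity',
        username='nova', password='secret', project_name='service',
        user_domain_name='Default', project_domain_name='Default')

    def wrap_user_auth(user_auth):
        # Requests made through this plugin send X-Service-Token in
        # addition to the user's X-Auth-Token, so Keystone can accept
        # an expired user token presented by a trusted service.
        return ServiceTokenAuthWrapper(
            user_auth=user_auth, service_auth=service_auth)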
[ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] result = getattr(controller, method)(*args, **kwargs) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._get(image_id) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] resp, body = self.http_client.get(url, headers=header) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.request(url, 'GET', **kwargs) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._handle_response(resp) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exc.from_response(resp, resp.content) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] During handling of the above exception, another exception occurred: [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.driver.spawn(context, instance, image_meta, [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._fetch_image_if_missing(context, vi) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image_fetch(context, vi, tmp_image_ds_loc) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] images.fetch_image( [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] metadata = IMAGE_API.get(context, image_ref) [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return session.show(context, image_id, [ 1021.988673] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] _reraise_translated_image_exception(image_id) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise new_exc.with_traceback(exc_trace) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 
31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] result = getattr(controller, method)(*args, **kwargs) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._get(image_id) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] resp, body = self.http_client.get(url, headers=header) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.request(url, 'GET', **kwargs) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._handle_response(resp) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exc.from_response(resp, resp.content) [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. 
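Annotation: the step above from glanceclient.exc.HTTPUnauthorized to nova.exception.ImageNotAuthorized is Nova's image-exception translation, visible in the glance.py:287 and glance.py:1031 frames: the client error is caught and re-raised as a Nova-native type while the original traceback is preserved. A simplified sketch of that pattern (the class and message are illustrative, not Nova's exact code):

    import sys

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""

    def reraise_translated_image_exception(image_id):
        # Must be called from inside an 'except' block, as at
        # glance.py:287 in the traceback above.
        exc_type, exc_value, exc_trace = sys.exc_info()
        new_exc = ImageNotAuthorized(
            'Not authorized for image %s.' % image_id)
        # Re-raise the Nova exception on the client library's
        # traceback, which is why the two appear chained in this log.
        raise new_exc.with_traceback(exc_trace)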
[ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] During handling of the above exception, another exception occurred: [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._build_and_run_instance(context, instance, image, [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] with excutils.save_and_reraise_exception(): [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.force_reraise() [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise self.value [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] with self.rt.instance_claim(context, instance, node, allocs, [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.abort() [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1021.989919] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return f(*args, **kwargs) [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._unset_instance_host_and_node(instance) [ 1021.991061] env[59490]: ERROR nova.compute.manager 
[instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] instance.save() [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] updates, result = self.indirection_api.object_action( [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return cctxt.call(context, 'object_action', objinst=objinst, [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] result = self.transport._send( [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._driver.send(target, ctxt, message, [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise result [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] nova.exception_Remote.InstanceNotFound_Remote: Instance 31207de9-e903-4ed4-bccc-c0796edec34b could not be found. 
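Annotation: InstanceNotFound_Remote is not a class defined in Nova. When the conductor's RPC reply carries a serialized InstanceNotFound, oslo.messaging synthesizes a subclass on the client whose name gains a _Remote suffix and whose string form embeds the server-side traceback, which is what prints next. A conceptual sketch of that mechanism (the real logic lives in oslo_messaging's deserialize_remote_exception):

    def make_remote_exception(ex_type, remote_traceback_lines):
        # Build '<Name>_Remote', a dynamic subclass of the original
        # exception whose str() appends the traceback captured on the
        # server side of the RPC call.
        def __str__(self):
            return '\n'.join([Exception.__str__(self)]
                             + remote_traceback_lines)

        return type(ex_type.__name__ + '_Remote', (ex_type,),
                    {'__str__': __str__})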
[ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return getattr(target, method)(*args, **kwargs) [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return fn(self, *args, **kwargs) [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] old_ref, inst_ref = db.instance_update_and_get_original( [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return f(*args, **kwargs) [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] with excutils.save_and_reraise_exception() as ectxt: [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.force_reraise() [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise self.value [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return f(*args, 
**kwargs) [ 1021.991061] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return f(context, *args, **kwargs) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exception.InstanceNotFound(instance_id=uuid) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] nova.exception.InstanceNotFound: Instance 31207de9-e903-4ed4-bccc-c0796edec34b could not be found. [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] During handling of the above exception, another exception occurred: [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] exception_handler_v20(status_code, error_body) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise client_exc(message=error_message, [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1021.995596] 
env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Neutron server returns request_ids: ['req-c11fc659-019d-4c5c-98c8-1753dfc924bd'] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] During handling of the above exception, another exception occurred: [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Traceback (most recent call last): [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._deallocate_network(context, instance, requested_networks) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self.network_api.deallocate_for_instance( [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] data = neutron.list_ports(**search_opts) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.list('ports', self.ports_path, retrieve_all, [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1021.995596] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] for r in self._pagination(collection, path, **params): [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] res = self.get(path, params=params) [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.retry_request("GET", action, body=body, [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] return self.do_request(method, action, body=body, [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] ret = obj(*args, **kwargs) [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] self._handle_fault_response(status_code, replybody, resp) [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] raise exception.Unauthorized() [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] nova.exception.Unauthorized: Not authorized. [ 1021.997019] env[59490]: ERROR nova.compute.manager [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] [ 1022.021705] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80a61b97-9564-4d78-9006-7d1ede3af179 tempest-InstanceActionsNegativeTestJSON-1959984193 tempest-InstanceActionsNegativeTestJSON-1959984193-project-member] Lock "31207de9-e903-4ed4-bccc-c0796edec34b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.236s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.032695] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1022.054836] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1022.055592] env[59490]: ERROR nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last): [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] result = getattr(controller, method)(*args, **kwargs) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._get(image_id) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] resp, body = self.http_client.get(url, headers=header) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.request(url, 'GET', **kwargs) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._handle_response(resp) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise exc.from_response(resp, resp.content) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] During handling of the above exception, another exception occurred: [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last): [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] yield resources [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.driver.spawn(context, instance, image_meta, [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._fetch_image_if_missing(context, vi) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image_fetch(context, vi, tmp_image_ds_loc) [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] images.fetch_image( [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1022.055592] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] metadata = IMAGE_API.get(context, image_ref) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return session.show(context, image_id, [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] _reraise_translated_image_exception(image_id) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise new_exc.with_traceback(exc_trace) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] result = getattr(controller, method)(*args, **kwargs) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._get(image_id) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] resp, body = self.http_client.get(url, headers=header) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.request(url, 'GET', **kwargs) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._handle_response(resp) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise 
exc.from_response(resp, resp.content) [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1022.056732] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] [ 1022.056732] env[59490]: INFO nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Terminating instance [ 1022.057473] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1022.057682] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.058406] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1022.058480] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1022.058676] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-110e2ac8-a00b-49e9-9938-287afe21f759 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.061314] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e6b2a62f-c9bd-403c-a50b-4a612fdc8d97 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.075351] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1022.079134] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-67839ddf-617f-44cc-8365-3f75c5f2137e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.080828] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir 
/opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.080992] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1022.084555] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-53dea256-0096-4cd6-be25-a4a6b49e446d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.090295] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Waiting for the task: (returnval){ [ 1022.090295] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52787eb7-32bd-8ae9-88d8-32cccc35c590" [ 1022.090295] env[59490]: _type = "Task" [ 1022.090295] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1022.097847] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52787eb7-32bd-8ae9-88d8-32cccc35c590, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1022.101516] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.101733] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.103292] env[59490]: INFO nova.compute.claims [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1022.181447] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Successfully created port: 194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1022.319111] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a748196e-1a35-4336-99a5-60d9449ddfc7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.326766] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f8853288-0059-4164-ab5b-bce7200cc21e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.357423] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fa29c3b-0561-4a35-9431-744a00d3724b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.364618] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f974200-8f0b-48c8-b0b1-3935a039dded {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.377369] env[59490]: DEBUG nova.compute.provider_tree [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1022.388898] env[59490]: DEBUG nova.scheduler.client.report [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1022.401601] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.300s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.402062] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1022.433766] env[59490]: DEBUG nova.compute.utils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1022.435025] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Allocating IP information in the background. 
{{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1022.435184] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1022.443913] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1022.508620] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1022.515424] env[59490]: DEBUG nova.policy [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '32fffc7664814bdba81ed340d27e444c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5530a0bb6d434878aed7b9c96009b416', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1022.542359] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1022.542603] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1022.542750] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image limits 0:0:0 
{{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1022.542927] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1022.543077] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1022.543222] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1022.543425] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1022.543578] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1022.543738] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1022.543893] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1022.545144] env[59490]: DEBUG nova.virt.hardware [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1022.546085] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-53782a7b-8120-4e65-be9e-91447c5bad93 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.554107] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b8d353-5724-49df-8dc2-3ad571da6223 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.599909] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] 
[instance: 581848be-38fb-42da-b723-480bf297d1a5] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1022.600173] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Creating directory with path [datastore2] vmware_temp/46c3e5b8-8f63-4f9a-a0b4-4a76bdbb8a2d/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.600389] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5f0434b3-2a4e-4814-b6de-da9a54cc2b7e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.620117] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Created directory with path [datastore2] vmware_temp/46c3e5b8-8f63-4f9a-a0b4-4a76bdbb8a2d/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.620297] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Fetch image to [datastore2] vmware_temp/46c3e5b8-8f63-4f9a-a0b4-4a76bdbb8a2d/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1022.620710] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/46c3e5b8-8f63-4f9a-a0b4-4a76bdbb8a2d/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1022.621202] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b795d664-425f-4247-a9b4-b7a119312069 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.634033] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b92a300-4148-4218-8914-7d1f97ba3973 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.644282] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9682722f-edac-4349-94df-887792a645e5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.673776] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3bea053d-6e1b-4664-890f-9abb5e6daa46 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.679340] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f498c91e-6d06-49ed-b54c-07f217ef856c {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.699484] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1022.723759] env[59490]: DEBUG nova.compute.manager [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Received event network-vif-plugged-194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1022.724209] env[59490]: DEBUG oslo_concurrency.lockutils [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] Acquiring lock "ddbac2db-c555-4554-aa21-7303c8e36371-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1022.724848] env[59490]: DEBUG oslo_concurrency.lockutils [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] Lock "ddbac2db-c555-4554-aa21-7303c8e36371-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1022.725158] env[59490]: DEBUG oslo_concurrency.lockutils [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] Lock "ddbac2db-c555-4554-aa21-7303c8e36371-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1022.725399] env[59490]: DEBUG nova.compute.manager [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] No waiting events found dispatching network-vif-plugged-194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1022.725675] env[59490]: WARNING nova.compute.manager [req-46ec2b2d-aebb-49f3-9e51-96396ae6db08 req-0e7f559b-759a-4f9d-84c0-af18c48a0848 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Received unexpected event network-vif-plugged-194fb95e-e711-4985-b42b-b1ffe7bf588e for instance with vm_state building and task_state spawning. 
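Annotation: the placement inventory recorded at [ 1022.388898] above explains why the m1.nano claim at [ 1022.103292] succeeds instantly. Placement computes usable capacity per resource class as (total - reserved) * allocation_ratio. A minimal sketch applying that formula to the logged inventory (plain Python, not Nova code):

# Effective capacity implied by the inventory record at [ 1022.388898].
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    # Usable capacity = (total - reserved) * allocation_ratio.
    usable = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {usable:g} allocatable units")

# Prints VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400, so a 1-vCPU,
# 128 MB, 1 GB m1.nano instance is a trivial fit on this node.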
[ 1022.784821] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1022.785059] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1022.785233] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Deleting the datastore file [datastore2] 2f083456-3eb9-4022-86a3-8d39f83c470f {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1022.785491] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e7fd8de2-6e59-4fa8-b4c1-4048bd186a07 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1022.792226] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Waiting for the task: (returnval){
[ 1022.792226] env[59490]: value = "task-707446"
[ 1022.792226] env[59490]: _type = "Task"
[ 1022.792226] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1022.802252] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Task: {'id': task-707446, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1022.807757] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1022.808527] env[59490]: ERROR nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last):
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] result = getattr(controller, method)(*args, **kwargs)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._get(image_id)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] resp, body = self.http_client.get(url, headers=header)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.request(url, 'GET', **kwargs)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._handle_response(resp)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exc.from_response(resp, resp.content)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5]
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] During handling of the above exception, another exception occurred:
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5]
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last):
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] yield resources
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.driver.spawn(context, instance, image_meta,
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._fetch_image_if_missing(context, vi)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image_fetch(context, vi, tmp_image_ds_loc)
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] images.fetch_image(
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1022.808527] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] metadata = IMAGE_API.get(context, image_ref)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return session.show(context, image_id,
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] _reraise_translated_image_exception(image_id)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise new_exc.with_traceback(exc_trace)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] result = getattr(controller, method)(*args, **kwargs)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._get(image_id)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] resp, body = self.http_client.get(url, headers=header)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.request(url, 'GET', **kwargs)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._handle_response(resp)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exc.from_response(resp, resp.content)
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5]
[ 1022.809445] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1022.809445] env[59490]: INFO nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Terminating instance [ 1022.812250] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1022.812250] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1022.812250] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f1ed0a07-4de6-4c05-b1ba-2017073768c4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.814453] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1022.815041] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1022.820017] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85e7ae88-2268-45f5-9ba8-086bff9d0207 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.824787] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1022.825237] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ff6b31b2-3e2a-4171-b228-f8664580e611 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.835784] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1022.836109] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 
tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1022.837391] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d6e724ec-6dc4-4d75-8bb0-14a6e3fc2278 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.842969] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Waiting for the task: (returnval){ [ 1022.842969] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52ef2f86-ac2e-a895-6b3b-1b2697b19e4c" [ 1022.842969] env[59490]: _type = "Task" [ 1022.842969] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1022.852471] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52ef2f86-ac2e-a895-6b3b-1b2697b19e4c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1022.880231] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Successfully updated port: 194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1022.889306] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1022.889306] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1022.889306] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1022.891189] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1022.891438] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 
tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1022.891921] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Deleting the datastore file [datastore2] 581848be-38fb-42da-b723-480bf297d1a5 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1022.892664] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2c9c7208-5aa8-46ce-bf5f-87c49cef5bb4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1022.900659] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Waiting for the task: (returnval){ [ 1022.900659] env[59490]: value = "task-707448" [ 1022.900659] env[59490]: _type = "Task" [ 1022.900659] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1022.909733] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Task: {'id': task-707448, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1022.959166] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1023.150449] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Successfully created port: dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1023.215529] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Updating instance_info_cache with network_info: [{"id": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "address": "fa:16:3e:3c:62:8d", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap194fb95e-e7", "ovs_interfaceid": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1023.228460] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1023.228769] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance network_info: |[{"id": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "address": "fa:16:3e:3c:62:8d", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": 
"nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap194fb95e-e7", "ovs_interfaceid": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1023.229149] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3c:62:8d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c2d3bf80-d60a-4b53-a00a-1381de6d4a12', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '194fb95e-e711-4985-b42b-b1ffe7bf588e', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1023.236792] env[59490]: DEBUG oslo.service.loopingcall [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1023.237590] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1023.237823] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-165188bc-a19f-478a-80aa-7d6c63b7067a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.259814] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1023.259814] env[59490]: value = "task-707449" [ 1023.259814] env[59490]: _type = "Task" [ 1023.259814] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.267992] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707449, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.301149] env[59490]: DEBUG oslo_vmware.api [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Task: {'id': task-707446, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06939} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1023.301420] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1023.301592] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1023.301752] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1023.301985] env[59490]: INFO nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Took 1.24 seconds to destroy the instance on the hypervisor. [ 1023.304759] env[59490]: DEBUG nova.compute.claims [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1023.304759] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.304759] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.332644] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.333407] env[59490]: DEBUG nova.compute.utils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance 2f083456-3eb9-4022-86a3-8d39f83c470f could not be found. 
{{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1023.335431] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1023.335603] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1023.335747] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1023.335940] env[59490]: DEBUG nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1023.336053] env[59490]: DEBUG nova.network.neutron [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1023.352814] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1023.353080] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Creating directory with path [datastore2] vmware_temp/97efed62-0264-4718-835f-a72c85fb5734/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.353331] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fe160433-018c-4570-8c62-d2d71f66a27a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.364775] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Created directory with path [datastore2] vmware_temp/97efed62-0264-4718-835f-a72c85fb5734/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.364990] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 
tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Fetch image to [datastore2] vmware_temp/97efed62-0264-4718-835f-a72c85fb5734/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1023.365179] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/97efed62-0264-4718-835f-a72c85fb5734/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1023.365953] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a69c8c26-5a39-4d16-9585-3813d3dec970 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.373501] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea043d85-d844-49b0-84ee-2721c34be73b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.383293] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb503e37-5080-41bb-b3d8-023e45a18604 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.419381] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51c302e3-4e56-4f32-8462-b82be6d9be98 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.428603] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-99d6dbf7-9419-4078-8f6c-709798b7aa7e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.430337] env[59490]: DEBUG oslo_vmware.api [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Task: {'id': task-707448, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067092} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1023.430618] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1023.430796] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1023.430962] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1023.431144] env[59490]: INFO nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1023.433574] env[59490]: DEBUG nova.compute.claims [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1023.433775] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.433984] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.450777] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1023.462899] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.463645] env[59490]: DEBUG 
nova.compute.utils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance 581848be-38fb-42da-b723-480bf297d1a5 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1023.465182] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1023.465342] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1023.465498] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1023.465718] env[59490]: DEBUG nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1023.465801] env[59490]: DEBUG nova.network.neutron [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1023.501948] env[59490]: DEBUG neutronclient.v2_0.client [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1023.503706] env[59490]: ERROR nova.compute.manager [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] result = getattr(controller, method)(*args, **kwargs) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._get(image_id) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] resp, body = self.http_client.get(url, headers=header) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.request(url, 'GET', **kwargs) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._handle_response(resp) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exc.from_response(resp, resp.content) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] During handling of the above exception, another exception occurred: [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.driver.spawn(context, instance, image_meta, [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._fetch_image_if_missing(context, vi) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image_fetch(context, vi, tmp_image_ds_loc) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] images.fetch_image( [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] metadata = IMAGE_API.get(context, image_ref) [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return session.show(context, image_id, [ 1023.503706] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] _reraise_translated_image_exception(image_id) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise new_exc.with_traceback(exc_trace) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 
581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] result = getattr(controller, method)(*args, **kwargs) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._get(image_id) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] resp, body = self.http_client.get(url, headers=header) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.request(url, 'GET', **kwargs) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._handle_response(resp) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exc.from_response(resp, resp.content) [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. 
[ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] During handling of the above exception, another exception occurred: [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._build_and_run_instance(context, instance, image, [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] with excutils.save_and_reraise_exception(): [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.force_reraise() [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise self.value [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] with self.rt.instance_claim(context, instance, node, allocs, [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.abort() [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1023.505316] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return f(*args, **kwargs) [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._unset_instance_host_and_node(instance) [ 1023.507034] env[59490]: ERROR nova.compute.manager 
[instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] instance.save() [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] updates, result = self.indirection_api.object_action( [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return cctxt.call(context, 'object_action', objinst=objinst, [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] result = self.transport._send( [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._driver.send(target, ctxt, message, [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise result [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] nova.exception_Remote.InstanceNotFound_Remote: Instance 581848be-38fb-42da-b723-480bf297d1a5 could not be found. 
[ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return getattr(target, method)(*args, **kwargs) [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return fn(self, *args, **kwargs) [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] old_ref, inst_ref = db.instance_update_and_get_original( [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return f(*args, **kwargs) [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] with excutils.save_and_reraise_exception() as ectxt: [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.force_reraise() [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise self.value [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return f(*args, 
**kwargs) [ 1023.507034] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return f(context, *args, **kwargs) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exception.InstanceNotFound(instance_id=uuid) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] nova.exception.InstanceNotFound: Instance 581848be-38fb-42da-b723-480bf297d1a5 could not be found. [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] During handling of the above exception, another exception occurred: [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] exception_handler_v20(status_code, error_body) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise client_exc(message=error_message, [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1023.508669] 
env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Neutron server returns request_ids: ['req-ca1145d5-de0a-40f4-8bbd-cc0a370b816c'] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] During handling of the above exception, another exception occurred: [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] Traceback (most recent call last): [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._deallocate_network(context, instance, requested_networks) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self.network_api.deallocate_for_instance( [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] data = neutron.list_ports(**search_opts) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.list('ports', self.ports_path, retrieve_all, [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1023.508669] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] for r in self._pagination(collection, path, **params): [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] res = self.get(path, params=params) [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.retry_request("GET", action, body=body, [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] return self.do_request(method, action, body=body, [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] ret = obj(*args, **kwargs) [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] self._handle_fault_response(status_code, replybody, resp) [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] raise exception.Unauthorized() [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] nova.exception.Unauthorized: Not authorized. [ 1023.509838] env[59490]: ERROR nova.compute.manager [instance: 581848be-38fb-42da-b723-480bf297d1a5] [ 1023.529189] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7a304b77-7366-4706-bcd2-9a9ea1727c79 tempest-ServerAddressesTestJSON-1154682744 tempest-ServerAddressesTestJSON-1154682744-project-member] Lock "581848be-38fb-42da-b723-480bf297d1a5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 342.055s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1023.538984] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Starting instance...
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1023.567592] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1023.567592] env[59490]: ERROR nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] result = getattr(controller, method)(*args, **kwargs) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._get(image_id) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] resp, body = self.http_client.get(url, headers=header) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.request(url, 'GET', **kwargs) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._handle_response(resp) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File 
"/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise exc.from_response(resp, resp.content) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] During handling of the above exception, another exception occurred: [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] yield resources [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.driver.spawn(context, instance, image_meta, [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._fetch_image_if_missing(context, vi) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image_fetch(context, vi, tmp_image_ds_loc) [ 1023.567592] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] images.fetch_image( [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] metadata = IMAGE_API.get(context, image_ref) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File 
"/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return session.show(context, image_id, [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] _reraise_translated_image_exception(image_id) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise new_exc.with_traceback(exc_trace) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] result = getattr(controller, method)(*args, **kwargs) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._get(image_id) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] resp, body = self.http_client.get(url, headers=header) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.request(url, 'GET', **kwargs) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._handle_response(resp) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise 
exc.from_response(resp, resp.content) [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1023.568963] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1023.568963] env[59490]: INFO nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Terminating instance [ 1023.571115] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1023.571115] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1023.571115] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1023.571115] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1023.571115] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7b6e3c51-72e9-49ef-9ade-880d667c2a77 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.573271] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7b4a1e6-b6a4-4083-8bb5-51778962e358 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.584213] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1023.586781] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-945a0382-2f1c-4d29-8969-798883e45fb3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.588148] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created directory with path [datastore2] 
devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1023.588148] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1023.589171] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-3cae1338-21fa-43d8-93fd-08f7567da904 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.595054] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 1023.595054] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52d77392-cad3-55e1-3769-c1ebb4ea6be8" [ 1023.595054] env[59490]: _type = "Task" [ 1023.595054] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.598665] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1023.598921] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1023.600559] env[59490]: INFO nova.compute.claims [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1023.607865] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52d77392-cad3-55e1-3769-c1ebb4ea6be8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.659604] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1023.659845] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1023.660100] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Deleting the datastore file [datastore2] 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1023.660358] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-62c62c27-ab3b-4fba-bded-aa5cf91a777f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1023.667530] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Waiting for the task: (returnval){ [ 1023.667530] env[59490]: value = "task-707451" [ 1023.667530] env[59490]: _type = "Task" [ 1023.667530] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1023.675922] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Task: {'id': task-707451, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1023.686173] env[59490]: DEBUG neutronclient.v2_0.client [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1023.687695] env[59490]: ERROR nova.compute.manager [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last): [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] result = getattr(controller, method)(*args, **kwargs) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._get(image_id) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] resp, body = self.http_client.get(url, headers=header) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.request(url, 'GET', **kwargs) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._handle_response(resp) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise exc.from_response(resp, resp.content) [ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] During handling of the above exception, another exception occurred:
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last):
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.driver.spawn(context, instance, image_meta,
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._fetch_image_if_missing(context, vi)
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image_fetch(context, vi, tmp_image_ds_loc)
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] images.fetch_image(
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] metadata = IMAGE_API.get(context, image_ref)
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return session.show(context, image_id,
[ 1023.687695] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 287, in show
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] _reraise_translated_image_exception(image_id)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise new_exc.with_traceback(exc_trace)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] result = getattr(controller, method)(*args, **kwargs)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._get(image_id)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] resp, body = self.http_client.get(url, headers=header)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.request(url, 'GET', **kwargs)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._handle_response(resp)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise exc.from_response(resp, resp.content)
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] During handling of the above exception, another exception occurred:
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last):
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._build_and_run_instance(context, instance, image,
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] with excutils.save_and_reraise_exception():
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.force_reraise()
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise self.value
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] with self.rt.instance_claim(context, instance, node, allocs,
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.abort()
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.tracker.abort_instance_claim(self.context, self.instance_ref,
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1023.688692] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return f(*args, **kwargs)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._unset_instance_host_and_node(instance)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] instance.save()
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] updates, result = self.indirection_api.object_action(
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return cctxt.call(context, 'object_action', objinst=objinst,
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] result = self.transport._send(
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._driver.send(target, ctxt, message,
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self._send(target, ctxt, message, wait_for_reply, timeout,
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise result
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] nova.exception_Remote.InstanceNotFound_Remote: Instance 2f083456-3eb9-4022-86a3-8d39f83c470f could not be found.
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last):
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return getattr(target, method)(*args, **kwargs)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return fn(self, *args, **kwargs)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] old_ref, inst_ref = db.instance_update_and_get_original(
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return f(*args, **kwargs)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] with excutils.save_and_reraise_exception() as ectxt:
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.force_reraise()
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise self.value
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return f(*args, **kwargs)
[ 1023.689728] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return f(context, *args, **kwargs)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] instance_ref = _instance_get_by_uuid(context, instance_uuid,
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise exception.InstanceNotFound(instance_id=uuid)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] nova.exception.InstanceNotFound: Instance 2f083456-3eb9-4022-86a3-8d39f83c470f could not be found.
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] During handling of the above exception, another exception occurred:
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last):
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] exception_handler_v20(status_code, error_body)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise client_exc(message=error_message,
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Neutron server returns request_ids: ['req-41fb2485-9683-4959-8f3a-2b6fe6f71f7b']
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] During handling of the above exception, another exception occurred:
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Traceback (most recent call last):
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._deallocate_network(context, instance, requested_networks)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self.network_api.deallocate_for_instance(
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] data = neutron.list_ports(**search_opts)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.list('ports', self.ports_path, retrieve_all,
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1023.690783] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] for r in self._pagination(collection, path, **params):
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] res = self.get(path, params=params)
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.retry_request("GET", action, body=body,
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] return self.do_request(method, action, body=body,
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] ret = obj(*args, **kwargs)
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] self._handle_fault_response(status_code, replybody, resp)
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] raise exception.Unauthorized()
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] nova.exception.Unauthorized: Not authorized.
[ 1023.692845] env[59490]: ERROR nova.compute.manager [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f]
[ 1023.739737] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5438ef70-b364-4167-8c91-b3cc45c8d493 tempest-ServerActionsTestJSON-707402538 tempest-ServerActionsTestJSON-707402538-project-member] Lock "2f083456-3eb9-4022-86a3-8d39f83c470f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.899s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1023.749640] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1023.771795] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707449, 'name': CreateVM_Task} progress is 99%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1023.793767] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1023.824745] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-868c9612-caec-47f1-9774-5f4c74dd9e6d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1023.832170] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-796a3e69-3adf-49f3-a2b2-5f6108008b3d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1023.862653] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f8b0e4-bff2-4ab0-be41-6623ff97c539 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1023.870383] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51403ad3-385f-4e71-bc00-0ad238b40847 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1023.883881] env[59490]: DEBUG nova.compute.provider_tree [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1023.894085] env[59490]: DEBUG nova.scheduler.client.report [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1023.910268] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.311s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1023.910743] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1023.913011] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.119s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1023.914367] env[59490]: INFO nova.compute.claims [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1023.944022] env[59490]: DEBUG nova.compute.utils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1023.944022] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1023.944472] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1023.951313] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1024.026714] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1024.050808] env[59490]: DEBUG nova.policy [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70854147748745dc927f112c021113d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdb5f189084d4decab94abbff41e128b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1024.057759] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1024.057996] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1024.058254] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1024.058334] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1024.058466] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1024.058652] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1024.058829] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1024.058958] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1024.059229] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1024.059394] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1024.059553] env[59490]: DEBUG nova.virt.hardware [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1024.060641] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f3971a15-f7c5-4fd6-ab67-0728f79be750 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.072309] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecdc0171-438b-48c0-8291-de63564aff6a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.109620] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1024.109620] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Creating directory with path [datastore2] vmware_temp/39a9bacb-46e5-48d4-b63b-7d1cd30a4990/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1024.111212] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-982b7e9f-fcb6-4637-a604-7cbd3047bc5e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.122470] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Created directory with path [datastore2] vmware_temp/39a9bacb-46e5-48d4-b63b-7d1cd30a4990/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1024.122665] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Fetch image to [datastore2] vmware_temp/39a9bacb-46e5-48d4-b63b-7d1cd30a4990/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1024.122825] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/39a9bacb-46e5-48d4-b63b-7d1cd30a4990/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1024.123568] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81a596c7-cdc3-42e3-b3b3-371da8d7c757 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.131914] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2d6b929f-134f-4559-b3c2-5545404e24f4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.141863] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Successfully updated port: dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1024.144748] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69fd1d05-7bf1-4820-8d99-218c98464125 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.153459] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1024.153646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1024.153760] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1024.186654] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25ea37d-44a9-4b73-bf8b-113533461210 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.189755] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60d69296-a8a8-4c17-8a76-8282523ef0f7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.200891] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f17b636-3244-45a7-9db0-4f23d7600ec0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.204208] env[59490]: DEBUG oslo_vmware.api [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Task: {'id': task-707451, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069122} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1024.204710] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1024.204925] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1024.205136] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1024.205338] env[59490]: INFO nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Took 0.64 seconds to destroy the instance on the hypervisor.
[ 1024.207241] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-edcc186d-3e9b-4233-9613-f766cf90f8d6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.209409] env[59490]: DEBUG nova.compute.claims [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1024.209614] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1024.239224] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85910b23-06e6-40dd-982d-09c5f4ad7d5e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.241919] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1024.248391] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6ba8701-6c53-495d-ae69-7637f086d76a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1024.261505] env[59490]: DEBUG nova.compute.provider_tree [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1024.263235] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1024.272595] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707449, 'name': CreateVM_Task} progress is 99%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1024.273779] env[59490]: DEBUG nova.scheduler.client.report [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1024.302166] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.389s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1024.302642] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1024.309016] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.096s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1024.335385] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.029s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1024.339105] env[59490]: DEBUG nova.compute.utils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1024.340291] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1024.340480] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1024.340683] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1024.340863] env[59490]: DEBUG nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1024.341031] env[59490]: DEBUG nova.network.neutron [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1024.344741] env[59490]: DEBUG nova.compute.utils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1024.346038] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1024.346165] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1024.357526] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1024.365406] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1024.366234] env[59490]: ERROR nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Traceback (most recent call last):
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] image = self._client.call(context, 2, 'get', args=(image_id,))
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] result = getattr(controller, method)(*args, **kwargs)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self._get(image_id)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return RequestIdProxy(wrapped(*args, **kwargs))
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] resp, body = self.http_client.get(url, headers=header)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self.request(url, 'GET', **kwargs)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self._handle_response(resp)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] raise exc.from_response(resp, resp.content)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09]
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] During handling of the above exception, another exception occurred:
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09]
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Traceback (most recent call last):
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] yield resources
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] self.driver.spawn(context, instance, image_meta,
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] self._fetch_image_if_missing(context, vi)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] image_fetch(context, vi, tmp_image_ds_loc)
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] images.fetch_image(
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
[ 1024.366234] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] metadata = IMAGE_API.get(context, image_ref)
[ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
[ 1024.367782] env[59490]:
ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return session.show(context, image_id, [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] _reraise_translated_image_exception(image_id) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] raise new_exc.with_traceback(exc_trace) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] result = getattr(controller, method)(*args, **kwargs) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self._get(image_id) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] resp, body = self.http_client.get(url, headers=header) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self.request(url, 'GET', **kwargs) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] return self._handle_response(resp) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] raise exc.from_response(resp, resp.content) [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 
3464c5af-60a4-4b6d-b7ca-51cf7312cf09] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1024.367782] env[59490]: ERROR nova.compute.manager [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] [ 1024.367782] env[59490]: INFO nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Terminating instance [ 1024.369052] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.369052] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.369052] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.369052] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.369515] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1024.370359] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a94124b4-7d5f-4465-a13c-6c6957b1a917 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.378130] env[59490]: DEBUG nova.compute.utils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Can not refresh info_cache because instance was not found {{(pid=59490) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1024.380292] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.380462] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] 
Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1024.381208] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8fff6a06-3874-4e13-8788-628d3349c42a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.386894] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1024.386894] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]527e2d72-e1f1-a660-5abf-2d2846279e8d" [ 1024.386894] env[59490]: _type = "Task" [ 1024.386894] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.397631] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]527e2d72-e1f1-a660-5abf-2d2846279e8d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.427320] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1024.446980] env[59490]: DEBUG neutronclient.v2_0.client [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1024.448525] env[59490]: ERROR nova.compute.manager [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
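Two different services reject requests in quick succession here: Glance returns HTTP 401 during the image fetch above, and Neutron returns 401 while Nova tries to clean up ports. That pattern points at the request context's user token expiring mid-build rather than a per-service policy problem; the lock trace later in this log shows the build lock held for 342.345s, well past a short Tempest token lifetime. The full deallocation traceback for instance 1c7b3da9 continues directly below; first, a minimal sketch of why a re-authenticating keystoneauth session survives such delays while a fixed token cannot (endpoint and credentials below are hypothetical, not this deployment's config):

    # Minimal sketch: re-authenticating vs. fixed-token keystoneauth sessions.
    from keystoneauth1 import session
    from keystoneauth1.identity import v3

    # A Password plugin can fetch a fresh token whenever the current one
    # expires, so a session built on it survives multi-minute operations.
    auth = v3.Password(auth_url='http://keystone.example:5000/v3',
                       username='nova', password='secret',
                       project_name='service',
                       user_domain_id='default', project_domain_id='default')
    sess = session.Session(auth=auth)

    # A Token plugin is pinned to a single token; once Keystone expires it,
    # every request returns HTTP 401, which is the failure mode seen above
    # for both the Glance image fetch and the Neutron port listing.
    fixed = session.Session(auth=v3.Token(
        auth_url='http://keystone.example:5000/v3',
        token='gAAAA...',  # a user token with a finite lifetime
        project_name='demo', project_domain_id='default'))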
[ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] result = getattr(controller, method)(*args, **kwargs) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._get(image_id) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] resp, body = self.http_client.get(url, headers=header) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.request(url, 'GET', **kwargs) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._handle_response(resp) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise exc.from_response(resp, resp.content) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] During handling of the above exception, another exception occurred: [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.driver.spawn(context, instance, image_meta, [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._fetch_image_if_missing(context, vi) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image_fetch(context, vi, tmp_image_ds_loc) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] images.fetch_image( [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] metadata = IMAGE_API.get(context, image_ref) [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return session.show(context, image_id, [ 1024.448525] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] _reraise_translated_image_exception(image_id) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise new_exc.with_traceback(exc_trace) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 
1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] result = getattr(controller, method)(*args, **kwargs) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._get(image_id) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] resp, body = self.http_client.get(url, headers=header) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.request(url, 'GET', **kwargs) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._handle_response(resp) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise exc.from_response(resp, resp.content) [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. 
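The frames above capture Nova's image-exception translation layer in action: nova/image/glance.py catches the client-level glanceclient.exc.HTTPUnauthorized at line 285 and re-raises it at line 1031 as Nova's own ImageNotAuthorized via raise new_exc.with_traceback(exc_trace), preserving the client traceback, which is why both exceptions appear chained in this log. The outer chain continues below; here is a condensed sketch of the pattern (simplified from the frames shown, not a verbatim copy of Nova's code):

    import sys

    class ImageNotAuthorized(Exception):
        def __init__(self, image_id):
            super().__init__('Not authorized for image %s.' % image_id)

    def _translate_image_exception(exc, image_id):
        # Map client-level HTTP errors onto Nova's exception hierarchy.
        if exc.__class__.__name__ in ('HTTPUnauthorized', 'HTTPForbidden'):
            return ImageNotAuthorized(image_id)
        return exc

    def show(client, context, image_id):
        try:
            return client.call(context, 2, 'get', args=(image_id,))
        except Exception:
            _type, exc_value, exc_trace = sys.exc_info()
            new_exc = _translate_image_exception(exc_value, image_id)
            # Re-raise the translated exception with the original traceback
            # attached, producing the chained pair of tracebacks seen above.
            raise new_exc.with_traceback(exc_trace)

    class _FakeClient:  # stand-in so the sketch runs end to end
        def call(self, context, version, method, args):
            raise type('HTTPUnauthorized', (Exception,), {})('HTTP 401')

    try:
        show(_FakeClient(), None, '2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9')
    except ImageNotAuthorized as err:
        print(err)  # Not authorized for image 2ec2b44f-...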
[ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] During handling of the above exception, another exception occurred: [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._build_and_run_instance(context, instance, image, [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] with excutils.save_and_reraise_exception(): [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.force_reraise() [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise self.value [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] with self.rt.instance_claim(context, instance, node, allocs, [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.abort() [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1024.452784] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return f(*args, **kwargs) [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._unset_instance_host_and_node(instance) [ 1024.453730] env[59490]: ERROR nova.compute.manager 
[instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] instance.save() [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] updates, result = self.indirection_api.object_action( [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return cctxt.call(context, 'object_action', objinst=objinst, [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] result = self.transport._send( [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._driver.send(target, ctxt, message, [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise result [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] nova.exception_Remote.InstanceNotFound_Remote: Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 could not be found. 
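InstanceNotFound_Remote is not a class defined anywhere in Nova. When instance.save() is routed through the conductor's indirection API, the server-side InstanceNotFound is serialized, shipped back over RPC, and rebuilt client-side as a dynamically generated subclass whose name carries the _Remote suffix and whose message embeds the server-side traceback as text; that is why the frames reproduced below alternate with near-empty continuation lines. A simplified sketch of the idea (illustrative only, not oslo.messaging's exact deserializer):

    # Simplified sketch of how a server-side exception comes back over RPC
    # as a dynamically generated "<Name>_Remote" subclass.
    def make_remote(exc_cls, message, remote_tb):
        remote_cls = type(exc_cls.__name__ + '_Remote', (exc_cls,), {})
        exc = remote_cls(message)
        # The server-side traceback travels as plain text appended to the
        # message, which is what gets reprinted line by line in the log.
        exc.args = (message + '\n' + remote_tb,)
        return exc

    class InstanceNotFound(Exception):
        pass

    err = make_remote(InstanceNotFound,
                      'Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 '
                      'could not be found.',
                      'Traceback (most recent call last):\n  ...')
    assert isinstance(err, InstanceNotFound)   # except clauses still match
    assert type(err).__name__ == 'InstanceNotFound_Remote'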
[ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return getattr(target, method)(*args, **kwargs) [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return fn(self, *args, **kwargs) [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] old_ref, inst_ref = db.instance_update_and_get_original( [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return f(*args, **kwargs) [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] with excutils.save_and_reraise_exception() as ectxt: [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.force_reraise() [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise self.value [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return f(*args, 
**kwargs) [ 1024.453730] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return f(context, *args, **kwargs) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise exception.InstanceNotFound(instance_id=uuid) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] nova.exception.InstanceNotFound: Instance 1c7b3da9-32ab-4aa0-90e3-f27bf5996590 could not be found. [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] During handling of the above exception, another exception occurred: [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] exception_handler_v20(status_code, error_body) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise client_exc(message=error_message, [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1024.454749] 
env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Neutron server returns request_ids: ['req-c7900616-abd6-4ee5-ae3b-845dd185d98e'] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] During handling of the above exception, another exception occurred: [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Traceback (most recent call last): [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._deallocate_network(context, instance, requested_networks) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self.network_api.deallocate_for_instance( [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] data = neutron.list_ports(**search_opts) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.list('ports', self.ports_path, retrieve_all, [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1024.454749] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] for r in self._pagination(collection, path, **params): [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] res = self.get(path, params=params) [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.retry_request("GET", action, body=body, [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] return self.do_request(method, action, body=body, [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] ret = obj(*args, **kwargs) [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] self._handle_fault_response(status_code, replybody, resp) [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] raise exception.Unauthorized() [ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] nova.exception.Unauthorized: Not authorized. 
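That closes the chain: the raise exception.Unauthorized() at nova/network/neutron.py:204 is Nova's neutron-client wrapper converting neutronclient.common.exceptions.Unauthorized into nova.exception.Unauthorized, the same translation pattern seen for Glance above, so compute-manager code never has to handle neutronclient exception types directly. A minimal sketch of such a wrapper (stand-in exception classes; a sketch of the pattern, not Nova's exact code):

    import functools

    class Unauthorized(Exception):
        """Stand-in for nova.exception.Unauthorized."""

    class ClientUnauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    def translate_unauthorized(func):
        # Wrap a neutronclient call so a client-level 401 surfaces as
        # Nova's own Unauthorized, as in the neutron.py:196/204 frames above.
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except ClientUnauthorized:
                raise Unauthorized()
        return wrapper

    @translate_unauthorized
    def list_ports(**search_opts):
        raise ClientUnauthorized('401: authentication required')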
[ 1024.456638] env[59490]: ERROR nova.compute.manager [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1024.456638] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1024.457607] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1024.457607] 
env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1024.457607] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1024.457607] env[59490]: DEBUG nova.virt.hardware [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1024.457607] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-691a1b54-7c78-4011-b04d-615d5f7cfb23 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.461421] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30c6d86c-4a9c-40be-aeb0-87bf21dcf4f3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.477644] env[59490]: DEBUG oslo_concurrency.lockutils [None req-fedc5414-c99e-4123-a7e4-7df516fc2282 tempest-ImagesOneServerNegativeTestJSON-1143204754 tempest-ImagesOneServerNegativeTestJSON-1143204754-project-member] Lock "1c7b3da9-32ab-4aa0-90e3-f27bf5996590" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.345s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.488035] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1024.495152] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance cache missing network info. 
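The nova.virt.hardware walkthrough above is mechanical: with all flavor and image limits and preferences at 0 and the maxima at 65536, Nova enumerates every (sockets, cores, threads) factorization of the flavor's vCPU count, filters by the maxima, and sorts by preference; for the one-vCPU m1.nano flavor the only factorization is 1:1:1, hence the single possible topology logged. A rough sketch of that enumeration (illustrative, not the exact code in nova/virt/hardware.py):

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate every (sockets, cores, threads) triple whose product
        # equals the vCPU count, within the per-dimension maxima.
        out = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            if vcpus % s:
                continue
            for c in range(1, min(vcpus // s, max_cores) + 1):
                if (vcpus // s) % c:
                    continue
                t = vcpus // (s * c)
                if t <= max_threads:
                    out.append((s, c, t))
        return out

    print(possible_topologies(1))  # [(1, 1, 1)] -- matches the log above
    print(possible_topologies(4))  # (1,1,4), (1,2,2), (2,2,1), (4,1,1), ...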
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1024.535581] env[59490]: DEBUG nova.policy [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3effda0799d54919a75ac276c79e5781', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '32f4727ed1224b1dbb45024ec3e49363', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1024.546462] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1024.546703] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1024.548138] env[59490]: INFO nova.compute.claims [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1024.700703] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1024.713459] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1024.713790] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1024.713971] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1024.715160] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-966acda9-9372-4e57-a468-1970fd8ca4c5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.725634] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1024.725874] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-74143b5f-dde4-41c1-94af-6142d65b54c5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.751526] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1024.751787] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1024.751980] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Deleting the datastore file [datastore2] 3464c5af-60a4-4b6d-b7ca-51cf7312cf09 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1024.752236] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-4a56866a-c567-4672-b47a-2c16a792a53c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.757972] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for the task: (returnval){ [ 1024.757972] env[59490]: value = "task-707453" [ 1024.757972] env[59490]: _type = "Task" [ 1024.757972] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.780687] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707449, 'name': CreateVM_Task, 'duration_secs': 1.508251} completed successfully. 
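The wait_for_task records above show oslo.vmware's standard shape for every vCenter operation: the SOAP call returns a Task managed-object reference immediately (the (returnval){ value = "task-707453" ... } block), and the library then polls that task until it reaches success or error, which is why "progress is 0%" lines precede "completed successfully" with a duration_secs figure. A generic sketch of that polling loop (get_task_info is a hypothetical helper here; the real implementation polls inside a loopingcall):

    import time

    def wait_for_task(session, task_ref, interval=0.5, timeout=300):
        # Poll the vCenter Task object until it finishes, mirroring the
        # "progress is N%" lines in the log above. get_task_info() is a
        # hypothetical helper returning an object with .state/.error.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = session.get_task_info(task_ref)
            if info.state == 'success':
                return info
            if info.state == 'error':
                raise RuntimeError('Task %s failed: %s'
                                   % (task_ref, info.error))
            time.sleep(interval)  # still 'queued' or 'running'
        raise TimeoutError('Task %s did not complete' % task_ref)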
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1024.780973] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': task-707453, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.783510] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1024.788780] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.788942] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.789293] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1024.790506] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cdf5980a-92e2-46fa-a5ee-a64a09c928ff {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.793575] env[59490]: DEBUG nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Received event network-changed-194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1024.793793] env[59490]: DEBUG nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Refreshing instance network info cache due to event network-changed-194fb95e-e711-4985-b42b-b1ffe7bf588e. 
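The lock names above, "[datastore2] devstack-image-cache_base/<image-id>", show how the VMware driver serializes its image cache: every build spawning from the same image on the same datastore contends on one named lock, so a single request downloads the vmdk while the others wait and then reuse the cached copy. A minimal sketch of that pattern with oslo.concurrency (cached and fetch_from_glance are hypothetical helpers):

    from oslo_concurrency import lockutils

    def cached(datastore, image_id):             # hypothetical helper
        return False

    def fetch_from_glance(datastore, image_id):  # hypothetical helper
        print('downloading %s to [%s] cache' % (image_id, datastore))

    def fetch_image_if_missing(datastore, image_id):
        # One lock per (datastore, image) pair, mirroring the
        # '[datastore2] devstack-image-cache_base/<image-id>' locks above:
        # the first builder downloads, concurrent builders wait and reuse.
        name = '[%s] devstack-image-cache_base/%s' % (datastore, image_id)
        with lockutils.lock(name):
            if not cached(datastore, image_id):
                fetch_from_glance(datastore, image_id)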
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1024.793997] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Acquiring lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1024.794149] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Acquired lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1024.794355] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Refreshing network info cache for port 194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1024.801900] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1024.801900] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52d273e0-a7ce-3137-4d44-8a1d33cc1dcd" [ 1024.801900] env[59490]: _type = "Task" [ 1024.801900] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1024.812240] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52d273e0-a7ce-3137-4d44-8a1d33cc1dcd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1024.814744] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c22f973-c7e7-485e-9c58-64196d23a39a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.821369] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b5a08a09-b86c-4418-b8d6-e27c64d6cdcf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.852153] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45379ae7-8a9a-42e7-8dd8-d3922ab048f8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.859497] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-633365a8-76e6-480d-8281-b3d873719d78 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.872542] env[59490]: DEBUG nova.compute.provider_tree [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1024.881264] env[59490]: DEBUG nova.scheduler.client.report [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1024.896680] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1024.896952] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1024.897236] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-10c6f213-29dd-44d7-9023-386f813b3d71 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.901493] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.355s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1024.901910] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1024.908851] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1024.909932] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Fetch image to [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1024.909932] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1024.910122] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-776faa01-744e-4097-982b-70d57044d729 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.917083] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7a3ff26-2df3-4faa-999a-5f69f6a4e646 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.927237] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41c2ae4e-e348-42eb-b73a-ef2a63e183ef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.957659] env[59490]: DEBUG nova.compute.utils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1024.959531] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44d1eed0-6fe5-4902-b667-768ef9f5e28f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.964422] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 
tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1024.964422] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1024.968137] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-9cbcc195-c3cc-4094-8da4-3372ed8bbaa9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1024.974015] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1024.991477] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1025.050462] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1025.061706] env[59490]: DEBUG nova.policy [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70854147748745dc927f112c021113d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdb5f189084d4decab94abbff41e128b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1025.073022] env[59490]: DEBUG oslo_vmware.rw_handles [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1025.128059] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Updating instance_info_cache with network_info: [{"id": "dd75bf58-eeff-451a-a180-f0480a97597f", "address": "fa:16:3e:e8:ec:f4", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd75bf58-ee", "ovs_interfaceid": "dd75bf58-eeff-451a-a180-f0480a97597f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.129787] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Successfully created port: 5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1025.134642] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1025.134868] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1025.135021] env[59490]: DEBUG nova.virt.hardware [None 
req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1025.135199] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1025.135340] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1025.135516] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1025.135672] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1025.135824] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1025.135983] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1025.136152] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1025.136317] env[59490]: DEBUG nova.virt.hardware [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1025.137986] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a2d232b-0151-44c2-be2f-7d440621a7d5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.140733] env[59490]: DEBUG oslo_vmware.rw_handles [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Completed reading data from the image 
iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1025.140887] env[59490]: DEBUG oslo_vmware.rw_handles [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1025.143511] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.143511] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance network_info: |[{"id": "dd75bf58-eeff-451a-a180-f0480a97597f", "address": "fa:16:3e:e8:ec:f4", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd75bf58-ee", "ovs_interfaceid": "dd75bf58-eeff-451a-a180-f0480a97597f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1025.148021] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e8:ec:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c2d3bf80-d60a-4b53-a00a-1381de6d4a12', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'dd75bf58-eeff-451a-a180-f0480a97597f', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1025.153551] env[59490]: DEBUG oslo.service.loopingcall [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1025.154160] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1025.156014] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd7e7e96-1038-4432-8c3c-32cac61a12e7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.160043] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-3aa53b3c-0dd0-4bc0-b25d-98dff8aa0bd0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.186201] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1025.186201] env[59490]: value = "task-707454" [ 1025.186201] env[59490]: _type = "Task" [ 1025.186201] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.194084] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707454, 'name': CreateVM_Task} progress is 6%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.268629] env[59490]: DEBUG oslo_vmware.api [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Task: {'id': task-707453, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041578} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1025.268929] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1025.269160] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1025.269370] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1025.269573] env[59490]: INFO nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Took 0.56 seconds to destroy the instance on the hypervisor. [ 1025.269817] env[59490]: DEBUG oslo.service.loopingcall [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1025.270060] env[59490]: DEBUG nova.compute.manager [-] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1025.272348] env[59490]: DEBUG nova.compute.claims [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1025.272532] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1025.272773] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1025.299225] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.299859] env[59490]: DEBUG nova.compute.utils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance 3464c5af-60a4-4b6d-b7ca-51cf7312cf09 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1025.301273] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance disappeared during build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1025.301432] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1025.301721] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquiring lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.301860] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Acquired lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.302031] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1025.312823] env[59490]: DEBUG nova.compute.utils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Can not refresh info_cache because instance was not found {{(pid=59490) refresh_info_cache_for_instance /opt/stack/nova/nova/compute/utils.py:1010}} [ 1025.315071] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.315289] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1025.315490] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.385127] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1025.387118] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Successfully created port: 4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1025.595532] env[59490]: DEBUG nova.network.neutron [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.605840] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Releasing lock "refresh_cache-3464c5af-60a4-4b6d-b7ca-51cf7312cf09" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.606037] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1025.606219] env[59490]: DEBUG nova.compute.manager [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Skipping network deallocation for instance since networking was not requested. {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2255}} [ 1025.665433] env[59490]: DEBUG oslo_concurrency.lockutils [None req-7f2c026b-71ea-4713-a6ab-bba8645c9159 tempest-ServerShowV247Test-818522866 tempest-ServerShowV247Test-818522866-project-member] Lock "3464c5af-60a4-4b6d-b7ca-51cf7312cf09" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.514s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.681224] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1025.699287] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707454, 'name': CreateVM_Task, 'duration_secs': 0.280706} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1025.699287] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1025.699389] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.699516] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.699816] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1025.700055] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9f2540df-2052-4597-bc7d-864c5a0b88f2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.707580] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1025.707580] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]525e4a9d-811f-74fb-a219-1308a598c354" [ 1025.707580] env[59490]: _type = "Task" [ 1025.707580] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1025.714777] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]525e4a9d-811f-74fb-a219-1308a598c354, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1025.732164] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Successfully created port: 0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1025.738092] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1025.738334] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1025.739891] env[59490]: INFO nova.compute.claims [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1025.779167] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Updated VIF entry in instance network info cache for port 194fb95e-e711-4985-b42b-b1ffe7bf588e. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1025.779276] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Updating instance_info_cache with network_info: [{"id": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "address": "fa:16:3e:3c:62:8d", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap194fb95e-e7", "ovs_interfaceid": "194fb95e-e711-4985-b42b-b1ffe7bf588e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1025.797039] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Releasing lock "refresh_cache-ddbac2db-c555-4554-aa21-7303c8e36371" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1025.798027] env[59490]: DEBUG nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Received event network-vif-plugged-dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1025.798027] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Acquiring lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1025.798027] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1025.798027] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.798027] env[59490]: DEBUG nova.compute.manager 
[req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] No waiting events found dispatching network-vif-plugged-dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1025.798300] env[59490]: WARNING nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Received unexpected event network-vif-plugged-dd75bf58-eeff-451a-a180-f0480a97597f for instance with vm_state building and task_state spawning. [ 1025.798300] env[59490]: DEBUG nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Received event network-changed-dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1025.799536] env[59490]: DEBUG nova.compute.manager [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Refreshing instance network info cache due to event network-changed-dd75bf58-eeff-451a-a180-f0480a97597f. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1025.799536] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Acquiring lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1025.799536] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Acquired lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1025.799536] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Refreshing network info cache for port dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1025.932972] env[59490]: DEBUG nova.compute.manager [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Received event network-vif-plugged-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1025.933219] env[59490]: DEBUG oslo_concurrency.lockutils [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] Acquiring lock "f4bbfad2-f118-4292-bb36-4229c333dd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1025.933424] env[59490]: DEBUG oslo_concurrency.lockutils [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] Lock "f4bbfad2-f118-4292-bb36-4229c333dd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} 
[ 1025.933586] env[59490]: DEBUG oslo_concurrency.lockutils [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] Lock "f4bbfad2-f118-4292-bb36-4229c333dd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1025.933958] env[59490]: DEBUG nova.compute.manager [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] No waiting events found dispatching network-vif-plugged-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1025.933958] env[59490]: WARNING nova.compute.manager [req-029f75a5-5cf7-4966-a778-4709ed96492b req-08f116ff-4cf7-43e3-b6fb-a5d7a1b1a8fc service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Received unexpected event network-vif-plugged-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb for instance with vm_state building and task_state spawning. [ 1025.976952] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0748313-600f-4b0f-a0b8-7a1af1f4f53e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1025.986563] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b623ed1e-54f6-40b6-a9b3-2d3827abe67e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.018199] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-faccf799-58d6-4f6a-ac28-c28a3bd8e7b4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.025930] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-588fc7a8-ad0d-45e7-8240-716c74e2e7dd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.039831] env[59490]: DEBUG nova.compute.provider_tree [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1026.044021] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Successfully updated port: 4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1026.047828] env[59490]: DEBUG nova.scheduler.client.report [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 
'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1026.057362] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.057607] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.057673] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1026.061685] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.323s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.062120] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1026.097020] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1026.099791] env[59490]: DEBUG nova.compute.utils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1026.103333] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Allocating IP information in the background. 
{{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1026.103505] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1026.113539] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1026.191970] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1026.215396] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1026.215627] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1026.215784] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1026.215961] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1026.216114] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} 
[ 1026.216255] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1026.216571] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1026.216835] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1026.217035] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1026.217264] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1026.217469] env[59490]: DEBUG nova.virt.hardware [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1026.218301] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff9e8452-65dd-41b4-8c97-6ad6ca3f1d81 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.225108] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.225358] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1026.225577] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "[datastore2] 
devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.230292] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8b48ee9-7819-49fc-b8ba-c9784b802756 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.307524] env[59490]: DEBUG nova.policy [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c69b5e5d275141acacbf05d53fca7120', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'a743b565aa6143aba5f999cc3cc93483', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1026.324938] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Updating instance_info_cache with network_info: [{"id": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "address": "fa:16:3e:69:a9:f4", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4886220c-5f", "ovs_interfaceid": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1026.341787] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.342102] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance network_info: |[{"id": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "address": "fa:16:3e:69:a9:f4", "network": {"id": 
"ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4886220c-5f", "ovs_interfaceid": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1026.342491] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:69:a9:f4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '4886220c-5ffc-4c5b-b1f1-1f48293d8bcb', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1026.350269] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating folder: Project (cdb5f189084d4decab94abbff41e128b). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1026.350941] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-02bdcccd-6300-4aff-81f6-72a81317e6d5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.361918] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created folder: Project (cdb5f189084d4decab94abbff41e128b) in parent group-v168905. [ 1026.362342] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating folder: Instances. Parent ref: group-v168966. 
{{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1026.362418] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-eee17054-86a0-4e60-9de8-a64c61dbdaae {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1026.372288] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created folder: Instances in parent group-v168966. [ 1026.372518] env[59490]: DEBUG oslo.service.loopingcall [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1026.372696] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1026.372899] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-50cb88ee-9809-4396-b544-371cc00c07e7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1026.392880] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1026.392880] env[59490]: value = "task-707457"
[ 1026.392880] env[59490]: _type = "Task"
[ 1026.392880] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.404342] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707457, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1026.767676] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Updated VIF entry in instance network info cache for port dd75bf58-eeff-451a-a180-f0480a97597f.
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1026.768040] env[59490]: DEBUG nova.network.neutron [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Updating instance_info_cache with network_info: [{"id": "dd75bf58-eeff-451a-a180-f0480a97597f", "address": "fa:16:3e:e8:ec:f4", "network": {"id": "a6de70dd-79a2-4399-be40-94a0840cfae3", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-1524533379-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5530a0bb6d434878aed7b9c96009b416", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c2d3bf80-d60a-4b53-a00a-1381de6d4a12", "external-id": "nsx-vlan-transportzone-982", "segmentation_id": 982, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapdd75bf58-ee", "ovs_interfaceid": "dd75bf58-eeff-451a-a180-f0480a97597f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1026.780245] env[59490]: DEBUG oslo_concurrency.lockutils [req-a3364812-ef99-44ff-81c4-c95b99df556f req-24e66ae3-b42d-4261-be25-3581c08c1aa3 service nova] Releasing lock "refresh_cache-d9c5b959-e509-4d1b-8a0b-de2c58a7626f" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1026.818493] env[59490]: DEBUG nova.compute.manager [req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Received event network-vif-plugged-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1026.818714] env[59490]: DEBUG oslo_concurrency.lockutils [req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] Acquiring lock "e879cc90-f290-42cd-9059-46f42284a32c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1026.818958] env[59490]: DEBUG oslo_concurrency.lockutils [req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] Lock "e879cc90-f290-42cd-9059-46f42284a32c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1026.820797] env[59490]: DEBUG oslo_concurrency.lockutils [req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] Lock "e879cc90-f290-42cd-9059-46f42284a32c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1026.821050] env[59490]: DEBUG nova.compute.manager
[req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] No waiting events found dispatching network-vif-plugged-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1026.821308] env[59490]: WARNING nova.compute.manager [req-a32dd6ff-e125-405d-ad7e-e992ff5576d8 req-30639a95-84c9-4a14-936b-63d9aec1a002 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Received unexpected event network-vif-plugged-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc for instance with vm_state building and task_state spawning. [ 1026.903522] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707457, 'name': CreateVM_Task, 'duration_secs': 0.314213} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1026.903680] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1026.904673] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1026.904811] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1026.905139] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1026.905378] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-cdacad0f-a806-43e6-b7b5-c1203174647c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1026.909902] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){
[ 1026.909902] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5288dd8c-954d-c241-f509-e0f0ff129b69"
[ 1026.909902] env[59490]: _type = "Task"
[ 1026.909902] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1026.918820] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5288dd8c-954d-c241-f509-e0f0ff129b69, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.000327] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Successfully updated port: 5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1027.013859] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.013938] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.014048] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1027.118836] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1027.424685] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.424936] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1027.425169] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.425974] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Successfully created port: ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1027.540891] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Updating instance_info_cache with network_info: [{"id": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "address": "fa:16:3e:b9:f3:ba", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe5903c-68", "ovs_interfaceid": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.553836] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock 
"refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.554257] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance network_info: |[{"id": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "address": "fa:16:3e:b9:f3:ba", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe5903c-68", "ovs_interfaceid": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1027.554557] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b9:f3:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1027.564158] env[59490]: DEBUG oslo.service.loopingcall [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1027.564666] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1027.564902] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a0188cf8-56f7-483a-830b-e74ecf7ddc4f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1027.584902] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Successfully updated port: 0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1027.593068] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1027.593068] env[59490]: value = "task-707458"
[ 1027.593068] env[59490]: _type = "Task"
[ 1027.593068] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1027.601959] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707458, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1027.602914] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.603054] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquired lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.603195] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1027.659779] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance cache missing network info.
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1027.958749] env[59490]: DEBUG nova.compute.manager [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Received event network-changed-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1027.959239] env[59490]: DEBUG nova.compute.manager [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Refreshing instance network info cache due to event network-changed-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1027.959288] env[59490]: DEBUG oslo_concurrency.lockutils [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] Acquiring lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1027.959438] env[59490]: DEBUG oslo_concurrency.lockutils [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] Acquired lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1027.959595] env[59490]: DEBUG nova.network.neutron [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Refreshing network info cache for port 4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1027.971394] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Updating instance_info_cache with network_info: [{"id": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "address": "fa:16:3e:34:7f:ff", "network": {"id": "25944d03-c808-40b4-9d9e-7c14e343b030", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-917318635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32f4727ed1224b1dbb45024ec3e49363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d2742ba-c3af-4412-877d-c2811dfeba46", "external-id": "nsx-vlan-transportzone-390", "segmentation_id": 390, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c2a348c-67", "ovs_interfaceid": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1027.983275] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 
tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Releasing lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1027.983520] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance network_info: |[{"id": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "address": "fa:16:3e:34:7f:ff", "network": {"id": "25944d03-c808-40b4-9d9e-7c14e343b030", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-917318635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32f4727ed1224b1dbb45024ec3e49363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d2742ba-c3af-4412-877d-c2811dfeba46", "external-id": "nsx-vlan-transportzone-390", "segmentation_id": 390, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c2a348c-67", "ovs_interfaceid": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1027.983821] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:34:7f:ff', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d2742ba-c3af-4412-877d-c2811dfeba46', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0c2a348c-67d7-4b37-abf4-bd02b3180cbd', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1027.991854] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Creating folder: Project (32f4727ed1224b1dbb45024ec3e49363). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1027.992345] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f17ca935-5ed5-4bad-b83e-39b383c64fdd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.002843] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Created folder: Project (32f4727ed1224b1dbb45024ec3e49363) in parent group-v168905. 
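The Folder.CreateFolder and Folder.CreateVM_Task exchanges in the surrounding records all go through oslo.vmware's session layer. A minimal sketch of that call pattern follows; the endpoint, credentials, and the create_instance_folder helper name are placeholders for illustration, not values or code from this log:

    from oslo_vmware import api

    # Placeholder host/credentials; tuning values are illustrative.
    session = api.VMwareAPISession(
        'vc.example.org', 'svc-user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    def create_instance_folder(parent_ref, name):
        # Folder.CreateFolder is synchronous and returns the new folder's
        # managed object reference (the "group-v..." values in these records).
        return session.invoke_api(session.vim, 'CreateFolder', parent_ref, name=name)

    # Long-running calls such as Folder.CreateVM_Task return a task reference
    # instead, which session.wait_for_task() polls to completion -- the source
    # of the recurring "Waiting for the task" / "progress is 0%" records here.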
[ 1028.003034] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Creating folder: Instances. Parent ref: group-v168970. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1028.003246] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0ce63147-36d0-4a3b-9695-d2bc5bc7f363 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.011446] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Created folder: Instances in parent group-v168970. [ 1028.011658] env[59490]: DEBUG oslo.service.loopingcall [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1028.011824] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1028.012008] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c7aa064e-8bcd-4ce3-88c8-f59d2a7f69be {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1028.030317] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1028.030317] env[59490]: value = "task-707461"
[ 1028.030317] env[59490]: _type = "Task"
[ 1028.030317] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.037889] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707461, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.103923] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707458, 'name': CreateVM_Task, 'duration_secs': 0.281526} completed successfully.
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1028.104103] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1028.104808] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.104953] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.107018] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1028.107018] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5ccb8794-8167-42e1-af41-c243ace889f9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1028.110116] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){
[ 1028.110116] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52c4af69-6611-c1ef-cf1d-cf380befa6cf"
[ 1028.110116] env[59490]: _type = "Task"
[ 1028.110116] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.117899] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52c4af69-6611-c1ef-cf1d-cf380befa6cf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.540280] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707461, 'name': CreateVM_Task, 'duration_secs': 0.323428} completed successfully.
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1028.540482] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1028.541308] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.577256] env[59490]: DEBUG nova.network.neutron [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Updated VIF entry in instance network info cache for port 4886220c-5ffc-4c5b-b1f1-1f48293d8bcb. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1028.577610] env[59490]: DEBUG nova.network.neutron [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Updating instance_info_cache with network_info: [{"id": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "address": "fa:16:3e:69:a9:f4", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap4886220c-5f", "ovs_interfaceid": "4886220c-5ffc-4c5b-b1f1-1f48293d8bcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.587224] env[59490]: DEBUG oslo_concurrency.lockutils [req-8daee4d5-b26f-4bdb-937d-a8733f4bf044 req-05fa0968-c213-45b2-8c0d-8cd05adc978e service nova] Releasing lock "refresh_cache-f4bbfad2-f118-4292-bb36-4229c333dd4c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.619734] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.619961] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] 
[instance: e879cc90-f290-42cd-9059-46f42284a32c] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1028.620187] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.620622] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.620908] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1028.621160] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b7bd7137-ea40-43e4-bf3d-1d099590b950 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1028.625843] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Waiting for the task: (returnval){
[ 1028.625843] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52d6b99d-73d8-adbb-f923-bc4a5edcd596"
[ 1028.625843] env[59490]: _type = "Task"
[ 1028.625843] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.634905] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52d6b99d-73d8-adbb-f923-bc4a5edcd596, 'name': SearchDatastore_Task} progress is 0%.
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1028.635695] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Successfully updated port: ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1028.645788] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.645913] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquired lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.646068] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1028.682475] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1028.843978] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Updating instance_info_cache with network_info: [{"id": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "address": "fa:16:3e:f7:6a:47", "network": {"id": "1012b177-cb65-4ec3-a423-3bec20affa64", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-706734945-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a743b565aa6143aba5f999cc3cc93483", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "245efab9-c420-438e-a0b8-906357ef62c1", "external-id": "nsx-vlan-transportzone-959", "segmentation_id": 959, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce8b46a6-fc", "ovs_interfaceid": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1028.850045] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Received event network-changed-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1028.850233] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Refreshing instance network info cache due to event network-changed-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc. 
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1028.850433] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquiring lock "refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1028.850598] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquired lock "refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1028.850709] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Refreshing network info cache for port 5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1028.856234] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Releasing lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1028.856532] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance network_info: |[{"id": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "address": "fa:16:3e:f7:6a:47", "network": {"id": "1012b177-cb65-4ec3-a423-3bec20affa64", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-706734945-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a743b565aa6143aba5f999cc3cc93483", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "245efab9-c420-438e-a0b8-906357ef62c1", "external-id": "nsx-vlan-transportzone-959", "segmentation_id": 959, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce8b46a6-fc", "ovs_interfaceid": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1028.856866] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:6a:47', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '245efab9-c420-438e-a0b8-906357ef62c1', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 
'ce8b46a6-fc65-4276-8335-90b7ad600f33', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1028.864413] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Creating folder: Project (a743b565aa6143aba5f999cc3cc93483). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1028.865534] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8ce47153-277c-4d42-80c6-6903b6fe062f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.880905] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Created folder: Project (a743b565aa6143aba5f999cc3cc93483) in parent group-v168905. [ 1028.880905] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Creating folder: Instances. Parent ref: group-v168973. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1028.880905] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4bbb9407-9714-4afc-8f7b-9fd71c9724a5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1028.891155] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Created folder: Instances in parent group-v168973. [ 1028.891373] env[59490]: DEBUG oslo.service.loopingcall [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1028.893417] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1028.894181] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f4cdf2e5-f8a3-4f35-a5c3-f1326c19d1c0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1028.912359] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1028.912359] env[59490]: value = "task-707464"
[ 1028.912359] env[59490]: _type = "Task"
[ 1028.912359] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1028.921685] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707464, 'name': CreateVM_Task} progress is 0%.
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1029.117026] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Updated VIF entry in instance network info cache for port 5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1029.117388] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Updating instance_info_cache with network_info: [{"id": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "address": "fa:16:3e:b9:f3:ba", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5fe5903c-68", "ovs_interfaceid": "5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.126808] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Releasing lock "refresh_cache-e879cc90-f290-42cd-9059-46f42284a32c" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.127056] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Received event network-vif-plugged-0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1029.127241] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquiring lock "f6d58f5a-f432-47a2-af63-033ae4c3d414-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.127438] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Lock "f6d58f5a-f432-47a2-af63-033ae4c3d414-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.127602] env[59490]: DEBUG oslo_concurrency.lockutils 
[req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Lock "f6d58f5a-f432-47a2-af63-033ae4c3d414-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.127755] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] No waiting events found dispatching network-vif-plugged-0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1029.127908] env[59490]: WARNING nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Received unexpected event network-vif-plugged-0c2a348c-67d7-4b37-abf4-bd02b3180cbd for instance with vm_state building and task_state spawning. [ 1029.128076] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Received event network-changed-0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1029.128229] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Refreshing instance network info cache due to event network-changed-0c2a348c-67d7-4b37-abf4-bd02b3180cbd. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1029.128405] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquiring lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.128536] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquired lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.128686] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Refreshing network info cache for port 0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1029.141336] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.141542] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1029.141740] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.355787] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Updated VIF entry in instance network info cache for port 0c2a348c-67d7-4b37-abf4-bd02b3180cbd. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1029.355787] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Updating instance_info_cache with network_info: [{"id": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "address": "fa:16:3e:34:7f:ff", "network": {"id": "25944d03-c808-40b4-9d9e-7c14e343b030", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-917318635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "32f4727ed1224b1dbb45024ec3e49363", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d2742ba-c3af-4412-877d-c2811dfeba46", "external-id": "nsx-vlan-transportzone-390", "segmentation_id": 390, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c2a348c-67", "ovs_interfaceid": "0c2a348c-67d7-4b37-abf4-bd02b3180cbd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.368476] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Releasing lock "refresh_cache-f6d58f5a-f432-47a2-af63-033ae4c3d414" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.368683] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Received event network-vif-plugged-ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1029.368905] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquiring lock "014bca6d-9df7-4245-90b4-3f291262292a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1029.369119] env[59490]: DEBUG oslo_concurrency.lockutils 
[req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Lock "014bca6d-9df7-4245-90b4-3f291262292a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1029.369277] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Lock "014bca6d-9df7-4245-90b4-3f291262292a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1029.369436] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] No waiting events found dispatching network-vif-plugged-ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1029.369601] env[59490]: WARNING nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Received unexpected event network-vif-plugged-ce8b46a6-fc65-4276-8335-90b7ad600f33 for instance with vm_state building and task_state spawning. [ 1029.369760] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Received event network-changed-ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1029.369908] env[59490]: DEBUG nova.compute.manager [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Refreshing instance network info cache due to event network-changed-ce8b46a6-fc65-4276-8335-90b7ad600f33. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1029.370094] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquiring lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.370228] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Acquired lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.370372] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Refreshing network info cache for port ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1029.425725] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707464, 'name': CreateVM_Task, 'duration_secs': 0.282101} completed successfully. 
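The event handling in the entries above is a rendezvous between two actors: the spawn path registers the external events it expects (network-vif-plugged and friends), and the handler for events arriving from Neutron pops a matching waiter under a per-instance events lock, logging the "Received unexpected event ... vm_state building" warning when nothing is registered yet. A sketch of that pattern under those assumptions; the class and method names below are illustrative, not Nova's actual code:

import threading

class InstanceEvents:
    """Illustrative per-instance registry of expected external events."""

    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # (instance_id, event_name) -> threading.Event

    def prepare(self, instance_id, event_name):
        # The spawn path registers the events it expects before waiting.
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_id, event_name)] = ev
        return ev

    def pop_and_signal(self, instance_id, event_name):
        # Called when Neutron delivers e.g. network-vif-plugged; the
        # per-instance "-events" lock in the log plays this role.
        with self._lock:
            ev = self._waiters.pop((instance_id, event_name), None)
        if ev is None:
            print("WARNING: received unexpected event %s for instance %s"
                  % (event_name, instance_id))
        else:
            ev.set()

events = InstanceEvents()
events.pop_and_signal("f6d58f5a", "network-vif-plugged")  # prints the warning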
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1029.425725] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1029.426128] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.426281] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1029.426607] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1029.426841] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f827e29-4983-4e61-bbf5-1c580da55994 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1029.431269] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Waiting for the task: (returnval){ [ 1029.431269] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5229b2ca-167d-952c-d425-f5e1406a7b1d" [ 1029.431269] env[59490]: _type = "Task" [ 1029.431269] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1029.445224] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1029.445441] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1029.445634] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1029.684837] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Updated VIF entry in instance network info cache for port ce8b46a6-fc65-4276-8335-90b7ad600f33. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1029.685196] env[59490]: DEBUG nova.network.neutron [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Updating instance_info_cache with network_info: [{"id": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "address": "fa:16:3e:f7:6a:47", "network": {"id": "1012b177-cb65-4ec3-a423-3bec20affa64", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-706734945-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "a743b565aa6143aba5f999cc3cc93483", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "245efab9-c420-438e-a0b8-906357ef62c1", "external-id": "nsx-vlan-transportzone-959", "segmentation_id": 959, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapce8b46a6-fc", "ovs_interfaceid": "ce8b46a6-fc65-4276-8335-90b7ad600f33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1029.694561] env[59490]: DEBUG oslo_concurrency.lockutils [req-feef18c1-08f9-4068-9105-762b293e46de req-88d78546-8550-4551-9970-7a31fcf18fa7 service nova] Releasing lock "refresh_cache-014bca6d-9df7-4245-90b4-3f291262292a" {{(pid=59490) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1067.383930] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1067.384214] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1068.384952] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1070.379829] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1070.383470] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1070.383658] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1070.393312] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1070.393509] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1070.393662] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1070.393810] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1070.396411] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a20fd757-d6fd-4d16-8ed6-84a13664878d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.404766] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d0e1b8a-2c73-4b9e-9949-cb46dd4d2d13 {{(pid=59490) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.418390] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff410860-fb19-4e92-aba6-d2d79d949f56 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.424315] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3a762f8-fc42-4896-8a47-27c9d4e1807b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.452968] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181641MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1070.453154] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1070.453323] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1070.509837] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.509993] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510129] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ddbac2db-c555-4554-aa21-7303c8e36371 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510247] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
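The per-instance allocations listed here (and continuing below) make the resource tracker's arithmetic easy to verify: eight actively managed instances each hold {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, and the inventory reported further down reserves 512 MB of memory, so used_ram = 8 * 128 + 512 = 1536 MB, used_disk = 8 GB, and 8 vCPUs are allocated, which is exactly the "Final resource view" that follows. A quick check of those figures:

# Reproduce the resource tracker's "Final resource view" arithmetic.
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 8
reserved_mb = 512   # MEMORY_MB 'reserved' from the inventory data below

used_ram = sum(a['MEMORY_MB'] for a in allocations) + reserved_mb
used_disk = sum(a['DISK_GB'] for a in allocations)
used_vcpus = sum(a['VCPU'] for a in allocations)

print(used_ram, used_disk, used_vcpus)   # -> 1536 8 8, matching the log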
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510360] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e879cc90-f290-42cd-9059-46f42284a32c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510472] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510584] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance f4bbfad2-f118-4292-bb36-4229c333dd4c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.510693] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 014bca6d-9df7-4245-90b4-3f291262292a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1070.520685] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance d0673be9-d670-4d3f-aefa-26f4e336a695 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1070.530217] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance ecb7312c-80f0-490e-8357-7138680d0f90 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1070.539343] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance e24d5bbc-6168-4523-9a0c-cd29c14c9e56 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1070.548053] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 643bfd74-592a-452c-af62-ded4c23009f9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1692}} [ 1070.548260] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1070.548404] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1070.679782] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f00bd710-d790-488b-9c32-3f51044ef486 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.687178] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf04b9a2-fc7e-4eb1-9fbb-1ebc25051827 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.716979] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-785a1e95-81cc-4ba6-89f6-a9557e5dc7ad {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.723756] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-838dcc0b-b0b5-484b-b7a1-1f9179627909 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.736340] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1070.745106] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1070.757455] env[59490]: WARNING oslo_vmware.rw_handles [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1070.757455] env[59490]: 
ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1070.757455] env[59490]: ERROR oslo_vmware.rw_handles [ 1070.757990] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1070.759677] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1070.759754] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Copying Virtual Disk [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/a989fff8-f237-4407-a534-32d1b9685cb1/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1070.759998] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-48334506-cbaf-4be3-9748-41a68b9caf84 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1070.763124] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1070.763291] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.310s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1070.769126] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1070.769126] env[59490]: value = "task-707465" [ 1070.769126] env[59490]: _type = "Task" [ 1070.769126] env[59490]: } to 
complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1070.777377] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707465, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1071.279763] env[59490]: DEBUG oslo_vmware.exceptions [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1071.279992] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1071.280539] env[59490]: ERROR nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1071.280539] env[59490]: Faults: ['InvalidArgument'] [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Traceback (most recent call last): [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] yield resources [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] self.driver.spawn(context, instance, image_meta, [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] self._fetch_image_if_missing(context, vi) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] image_cache(vi, tmp_image_ds_loc) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] vm_util.copy_virtual_disk( [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] session._wait_for_task(vmdk_copy_task) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] return self.wait_for_task(task_ref) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] return evt.wait() [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] result = hub.switch() [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] return self.greenlet.switch() [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] self.f(*self.args, **self.kw) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] raise exceptions.translate_fault(task_info.error) [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Faults: ['InvalidArgument'] [ 1071.280539] env[59490]: ERROR nova.compute.manager [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] [ 1071.281503] env[59490]: INFO nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Terminating instance [ 1071.282278] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1071.282470] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1071.282689] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9981f2d0-302f-461b-9c7a-75292f4bc1d1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.284733] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1071.284916] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1071.285621] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8dcd6d5b-d9de-4907-a541-f80f1cedea12 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.292260] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1071.292446] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-00b7a374-5111-48e3-bfba-3d1d5b57a33d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.294418] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1071.294574] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Folder [datastore2] devstack-image-cache_base created. 
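The failed spawn above ends in oslo_vmware's fault-translation step: the CopyVirtualDisk_Task error carries the vSphere fault name InvalidArgument, the registry of dedicated exception classes is consulted ("Fault InvalidArgument not matched"), and the generic VimFaultException is raised with the fault list attached, which is what the traceback shows. A sketch of that dispatch; the registry contents here are illustrative, not oslo_vmware's real mapping:

class VimFaultException(Exception):
    """Generic VIM fault carrying the fault names reported by vCenter."""
    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list

class NoPermissionFault(VimFaultException):
    pass

# Faults with dedicated classes; anything else falls back to the generic one.
_FAULT_CLASSES = {'NoPermission': NoPermissionFault}

def translate_fault(fault_name, message):
    cls = _FAULT_CLASSES.get(fault_name, VimFaultException)
    return cls([fault_name], message)

try:
    raise translate_fault('InvalidArgument',
                          'A specified parameter was not correct: fileType')
except VimFaultException as exc:
    print(exc, exc.fault_list)   # mirrors "Faults: ['InvalidArgument']"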
{{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1071.295550] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-de7b5864-b84f-44b9-aff4-29ee8f3a9a09 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.300661] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){ [ 1071.300661] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]522fd4dc-ae11-6896-da1c-cb57bf68b3ac" [ 1071.300661] env[59490]: _type = "Task" [ 1071.300661] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1071.310404] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]522fd4dc-ae11-6896-da1c-cb57bf68b3ac, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1071.362106] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1071.362328] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1071.362485] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleting the datastore file [datastore2] 504e16b8-70d2-437f-ab3e-7631cb74abec {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1071.362749] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-754b1355-0d4f-4b46-a716-18a1a5cf1daa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.369294] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1071.369294] env[59490]: value = "task-707467" [ 1071.369294] env[59490]: _type = "Task" [ 1071.369294] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1071.377504] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707467, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1071.764705] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1071.812409] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1071.812690] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating directory with path [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1071.812915] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ad48fc40-ad17-4f90-ba51-8f36fa1fa6c6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.824413] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Created directory with path [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1071.824587] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Fetch image to [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1071.824750] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1071.825488] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48e9e073-3b52-4720-8027-cefea3a94326 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.831993] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51b1b5e1-c2f6-4fa8-a954-3f64cd64b09b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.840703] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b87ceb4-d5cf-4587-a186-f1a151a1f849 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.873335] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82ec795a-297d-47ce-8275-74046e6e3781 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.880046] env[59490]: DEBUG oslo_vmware.api [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707467, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065523} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1071.881717] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1071.881717] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1071.881867] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1071.881998] env[59490]: INFO nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Took 0.60 seconds to destroy the instance on the hypervisor. 
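The teardown above runs in a fixed order: unregister the VM from vCenter first, then delete the instance's files from the datastore, so files are never removed out from under a still-registered VM, and the manager reports the elapsed time at the end ("Took 0.60 seconds..."). A compact sketch of that sequence; the two callables stand in for the UnregisterVM and DeleteDatastoreFile_Task calls seen in the log:

import time

def destroy_instance(unregister_vm, delete_datastore_dir, instance_id):
    # Unregister first, then remove files, as the vmops flow above does.
    start = time.monotonic()
    unregister_vm(instance_id)
    delete_datastore_dir(instance_id)
    print("Took %.2f seconds to destroy the instance on the hypervisor."
          % (time.monotonic() - start))

destroy_instance(lambda i: None, lambda i: None,
                 "504e16b8-70d2-437f-ab3e-7631cb74abec")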
[ 1071.884076] env[59490]: DEBUG nova.compute.claims [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1071.884236] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1071.884440] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1071.886863] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e0552a77-c018-416f-86b9-5dcc6d3b90c0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1071.908289] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1071.909052] env[59490]: DEBUG nova.compute.utils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance 504e16b8-70d2-437f-ab3e-7631cb74abec could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1071.912112] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1071.914313] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Instance disappeared during build. 
[ 1071.914502] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1071.914681] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1071.915042] env[59490]: DEBUG nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1071.915042] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1071.950731] env[59490]: DEBUG nova.network.neutron [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1071.961202] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1071.963251] env[59490]: INFO nova.compute.manager [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] Took 0.05 seconds to deallocate network for instance.
[ 1072.017737] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1072.017907] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1072.035151] env[59490]: DEBUG oslo_concurrency.lockutils [None req-c6f140f4-af32-496f-94c6-2eb09bfd4ec6 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "504e16b8-70d2-437f-ab3e-7631cb74abec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 381.188s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1072.044221] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}}
[ 1072.093029] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1072.093290] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1072.094763] env[59490]: INFO nova.compute.claims [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1072.286508] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb990c22-0720-46c2-a48b-4f997b527c09 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.293708] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ad8e1ef-f8ee-4b8b-beb0-72de18bab0ce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.322039] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b819dc9-b906-4338-87de-a115f4644cca {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.328442] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37d18881-6aed-43cf-88e4-45158f2bf98f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.340816] env[59490]: DEBUG nova.compute.provider_tree [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1072.349495] env[59490]: DEBUG nova.scheduler.client.report [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1072.361785] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.268s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1072.362237] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}}
[ 1072.383728] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1072.383876] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1072.383984] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}}
[ 1072.391609] env[59490]: DEBUG nova.compute.utils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1072.392490] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
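[editor's note] The inventory record above is what the compute node reports to Placement. Schedulable capacity per resource class is (total - reserved) * allocation_ratio, with max_unit capping any single instance's request. A quick illustrative check of the figures logged above (plain arithmetic, not Placement code):

    # Inventory exactly as logged above; the capacity formula follows the
    # Placement model: (total - reserved) * allocation_ratio.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0, 'max_unit': 16},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0, 'max_unit': 65530},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0, 'max_unit': 80},
    }
    for rc, inv in inventory.items():
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(f"{rc}: schedulable={capacity:g}, per-instance cap={inv['max_unit']}")
    # VCPU: schedulable=192, per-instance cap=16
    # MEMORY_MB: schedulable=196078, per-instance cap=65530
    # DISK_GB: schedulable=400, per-instance cap=80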
[ 1072.392651] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1072.399740] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}}
[ 1072.403285] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.403430] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.403560] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.403685] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.403801] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.403917] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.404047] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.404167] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.404279] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1072.404393] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}}
[ 1072.445775] env[59490]: DEBUG nova.policy [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '70854147748745dc927f112c021113d9', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'cdb5f189084d4decab94abbff41e128b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1072.455660] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Start spawning the instance on the hypervisor. {{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}}
[ 1072.475517] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:52Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='84',id=12,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1072.475734] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1072.475882] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1072.476069] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1072.476210] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1072.476350] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1072.476583] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1072.476735] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1072.476897] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1072.477062] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1072.477235] env[59490]: DEBUG nova.virt.hardware [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1072.478085] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0a67194-c94a-48e2-b1ee-d325e85b79ee {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.485547] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2200298a-b853-44cb-9d0a-c02c7c8e7a58 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1072.727945] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Successfully created port: 3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
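[editor's note] The nova.virt.hardware records above enumerate CPU topologies for a 1-vCPU flavor: limits and preferences of 0 mean "unset", so the 65536 defaults apply, and the only factorization of 1 vCPU is 1 socket x 1 core x 1 thread. A simplified, illustrative enumeration of that idea (not the actual Nova algorithm):

    import itertools

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
        # Yield (sockets, cores, threads) triples whose product is exactly
        # vcpus, mirroring the "Build topologies for 1 vcpu(s)" step above.
        for s, c, t in itertools.product(range(1, vcpus + 1), repeat=3):
            if s * c * t == vcpus and s <= max_sockets and c <= max_cores and t <= max_threads:
                yield (s, c, t)

    print(list(possible_topologies(1)))  # [(1, 1, 1)] -- the single topology logged above
    print(list(possible_topologies(4)))  # six factorizations, e.g. (1, 2, 2), (2, 2, 1), ...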
[ 1073.381157] env[59490]: DEBUG nova.compute.manager [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Received event network-vif-plugged-3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1073.381157] env[59490]: DEBUG oslo_concurrency.lockutils [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] Acquiring lock "d0673be9-d670-4d3f-aefa-26f4e336a695-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1073.381157] env[59490]: DEBUG oslo_concurrency.lockutils [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] Lock "d0673be9-d670-4d3f-aefa-26f4e336a695-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1073.381157] env[59490]: DEBUG oslo_concurrency.lockutils [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] Lock "d0673be9-d670-4d3f-aefa-26f4e336a695-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1073.381157] env[59490]: DEBUG nova.compute.manager [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] No waiting events found dispatching network-vif-plugged-3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1073.381157] env[59490]: WARNING nova.compute.manager [req-a9e02f82-b018-40fa-8af4-75f1585c1cc6 req-6d1019c0-9120-4632-82b3-2080a80f9a19 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Received unexpected event network-vif-plugged-3b47807f-a3da-48ca-a186-564cbe8f3376 for instance with vm_state building and task_state spawning.
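[editor's note] The six records above show Nova's external-event plumbing: Neutron reports network-vif-plugged, the compute manager looks for a registered waiter under the per-instance "-events" lock, finds none, and logs the "Received unexpected event" warning. A toy sketch of that dispatch pattern; names here are hypothetical and the real implementation lives in nova.compute.manager.InstanceEvents:

    import threading

    waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare_for_event(instance_uuid, event_name):
        # A build that wants to block until the VIF is plugged registers first.
        evt = threading.Event()
        waiters[(instance_uuid, event_name)] = evt
        return evt

    def pop_instance_event(instance_uuid, event_name):
        evt = waiters.pop((instance_uuid, event_name), None)
        if evt is None:
            # No registered waiter: the "Received unexpected event" path above.
            print(f"unexpected event {event_name} for instance {instance_uuid}")
        else:
            evt.set()

    # No one registered, so this takes the warning path:
    pop_instance_event('d0673be9-d670-4d3f-aefa-26f4e336a695',
                       'network-vif-plugged-3b47807f-a3da-48ca-a186-564cbe8f3376')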
[ 1073.397097] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Successfully updated port: 3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1073.411828] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1073.412631] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1073.413131] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1073.454996] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1073.618859] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Updating instance_info_cache with network_info: [{"id": "3b47807f-a3da-48ca-a186-564cbe8f3376", "address": "fa:16:3e:0d:c4:ef", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b47807f-a3", "ovs_interfaceid": "3b47807f-a3da-48ca-a186-564cbe8f3376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1073.628976] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1073.629259] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance network_info: |[{"id": "3b47807f-a3da-48ca-a186-564cbe8f3376", "address": "fa:16:3e:0d:c4:ef", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b47807f-a3", "ovs_interfaceid": "3b47807f-a3da-48ca-a186-564cbe8f3376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}}
[ 1073.629748] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0d:c4:ef', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'd275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3b47807f-a3da-48ca-a186-564cbe8f3376', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1073.637213] env[59490]: DEBUG oslo.service.loopingcall [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}}
[ 1073.637696] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1073.637923] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-136b5b7a-c317-4f66-a610-8674fec7a150 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1073.658164] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1073.658164] env[59490]: value = "task-707468"
[ 1073.658164] env[59490]: _type = "Task"
[ 1073.658164] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1073.665428] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707468, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1074.168700] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707468, 'name': CreateVM_Task, 'duration_secs': 0.297177} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1074.168885] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1074.170022] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1074.170022] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1074.170201] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}}
[ 1074.170284] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7903cb11-9aeb-44f1-9b26-818e13ea122a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1074.174406] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){
[ 1074.174406] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52f46836-c522-d7e9-415f-7ad0e9073776"
[ 1074.174406] env[59490]: _type = "Task"
[ 1074.174406] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1074.181676] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52f46836-c522-d7e9-415f-7ad0e9073776, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
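[editor's note] The CreateVM_Task and SearchDatastore_Task records above come from oslo.vmware's task poller: wait_for_task() re-reads the task's state on a loop, logging "progress is N%" until a terminal state is reached. A self-contained sketch of that loop shape (this is not oslo.vmware itself; the poll_task callable and its states are stand-ins):

    import time

    def wait_for_task(poll_task, interval=0.5):
        # Poll until the task reports success or error, as in the
        # "progress is 0%" ... "completed successfully" records above.
        while True:
            state, info = poll_task()
            if state == 'success':
                return info
            if state == 'error':
                raise RuntimeError(info)
            time.sleep(interval)

    # Demo with canned task states:
    states = iter([('running', '0%'), ('running', '60%'), ('success', {'id': 'task-707468'})])
    print(wait_for_task(lambda: next(states), interval=0))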
[ 1074.383489] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1074.684825] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1074.685078] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1074.685281] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1075.406205] env[59490]: DEBUG nova.compute.manager [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Received event network-changed-3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1075.406506] env[59490]: DEBUG nova.compute.manager [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Refreshing instance network info cache due to event network-changed-3b47807f-a3da-48ca-a186-564cbe8f3376. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}}
[ 1075.406688] env[59490]: DEBUG oslo_concurrency.lockutils [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] Acquiring lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1075.406724] env[59490]: DEBUG oslo_concurrency.lockutils [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] Acquired lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1075.406941] env[59490]: DEBUG nova.network.neutron [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Refreshing network info cache for port 3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1075.693338] env[59490]: DEBUG nova.network.neutron [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Updated VIF entry in instance network info cache for port 3b47807f-a3da-48ca-a186-564cbe8f3376. {{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1075.693669] env[59490]: DEBUG nova.network.neutron [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Updating instance_info_cache with network_info: [{"id": "3b47807f-a3da-48ca-a186-564cbe8f3376", "address": "fa:16:3e:0d:c4:ef", "network": {"id": "ccb2545b-09bc-4488-9566-979cca2660c5", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-917985727-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "cdb5f189084d4decab94abbff41e128b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "d275d7c6-2a7b-4ee8-b6f4-fabf1ba1905f", "external-id": "nsx-vlan-transportzone-513", "segmentation_id": 513, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3b47807f-a3", "ovs_interfaceid": "3b47807f-a3da-48ca-a186-564cbe8f3376", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1075.705240] env[59490]: DEBUG oslo_concurrency.lockutils [req-f480c10a-ef25-4a24-a360-d888777972bf req-fe038467-4698-4175-8147-7ba6a3ce6766 service nova] Releasing lock "refresh_cache-d0673be9-d670-4d3f-aefa-26f4e336a695" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1077.384700] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
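[editor's note] The "Running periodic task ComputeManager._heal_instance_info_cache / _poll_rescued_instances / _poll_volume_usage" records are emitted by oslo.service's periodic-task machinery. The registration pattern, sketched with a hypothetical manager class (the decorator and base class are the real oslo.service API; the class and method body are illustrative):

    from oslo_config import cfg
    from oslo_service import periodic_task

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(cfg.CONF)

        # Registered by the decorator; run_periodic_tasks() invokes it on its
        # spacing and logs the "Running periodic task ..." DEBUG record above.
        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            pass

    Manager().run_periodic_tasks(context=None)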
[ 1091.776284] env[59490]: DEBUG nova.compute.manager [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-vif-deleted-622c9619-1870-4434-aee0-8d5ab7122977 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1091.776607] env[59490]: INFO nova.compute.manager [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Neutron deleted interface 622c9619-1870-4434-aee0-8d5ab7122977; detaching it from the instance and deleting it from the info cache
[ 1091.777007] env[59490]: DEBUG nova.network.neutron [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updating instance_info_cache with network_info: [{"id": "302b78eb-2b54-407c-b685-09c8f1da1100", "address": "fa:16:3e:92:a4:2a", "network": {"id": "0a2b28eb-fd9a-4e96-b1ab-a1422e9505de", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-999710046", "subnets": [{"cidr": "192.168.128.0/24", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.67", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "9c82d4ee7f154795b0d110a31b975096", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "503991c4-44d0-42d9-aa03-5259331f1051", "external-id": "nsx-vlan-transportzone-3", "segmentation_id": 3, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap302b78eb-2b", "ovs_interfaceid": "302b78eb-2b54-407c-b685-09c8f1da1100", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1091.787595] env[59490]: DEBUG oslo_concurrency.lockutils [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] Acquiring lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}}
[ 1093.892207] env[59490]: DEBUG nova.compute.manager [req-8258976e-6f5e-4ae2-8680-8ef01892f986 req-dd3367dc-21b7-4ecb-88de-8ff48a945dde service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Received event network-vif-deleted-302b78eb-2b54-407c-b685-09c8f1da1100 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1093.892485] env[59490]: DEBUG nova.compute.manager [req-8258976e-6f5e-4ae2-8680-8ef01892f986 req-dd3367dc-21b7-4ecb-88de-8ff48a945dde service nova] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Received event network-vif-deleted-dd75bf58-eeff-451a-a180-f0480a97597f {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1096.069529] env[59490]: DEBUG nova.compute.manager [req-9ad65e82-acc7-4dff-9ed9-1e32587b1b7e req-67c1d585-0edb-4278-91c0-cf7e13682d96 service nova] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Received event network-vif-deleted-194fb95e-e711-4985-b42b-b1ffe7bf588e {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1096.069529] env[59490]: DEBUG nova.compute.manager [req-9ad65e82-acc7-4dff-9ed9-1e32587b1b7e req-67c1d585-0edb-4278-91c0-cf7e13682d96 service nova] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Received event network-vif-deleted-0c2a348c-67d7-4b37-abf4-bd02b3180cbd {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1096.579097] env[59490]: DEBUG nova.compute.manager [req-7a88ee29-1685-41d6-bde7-4a2f729e9182 req-b2cb3d3c-4f73-4275-a2fc-804f8ef99537 service nova] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Received event network-vif-deleted-3b47807f-a3da-48ca-a186-564cbe8f3376 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1098.106119] env[59490]: DEBUG nova.compute.manager [req-5c6052fd-4793-42e3-ab80-eb4263ba6e7a req-e0d7202d-993c-43fc-81ae-02dd234796f7 service nova] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Received event network-vif-deleted-ce8b46a6-fc65-4276-8335-90b7ad600f33 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1098.106351] env[59490]: DEBUG nova.compute.manager [req-5c6052fd-4793-42e3-ab80-eb4263ba6e7a req-e0d7202d-993c-43fc-81ae-02dd234796f7 service nova] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Received event network-vif-deleted-4886220c-5ffc-4c5b-b1f1-1f48293d8bcb {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1098.106499] env[59490]: DEBUG nova.compute.manager [req-5c6052fd-4793-42e3-ab80-eb4263ba6e7a req-e0d7202d-993c-43fc-81ae-02dd234796f7 service nova] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Received event network-vif-deleted-5fe5903c-685c-4c8b-9ce8-7e08c9bea2cc {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}}
[ 1120.742216] env[59490]: WARNING oslo_vmware.rw_handles [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles response.begin()
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1120.742216] env[59490]: ERROR oslo_vmware.rw_handles
[ 1120.742909] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1120.744493] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1120.744769] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Copying Virtual Disk [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/6e56fbd1-91e6-4c6e-8b2b-09297f42159c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1120.745070] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-d4f558f7-299a-4447-bd8f-6ef1ce8f6e2f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1120.754530] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){
[ 1120.754530] env[59490]: value = "task-707480"
[ 1120.754530] env[59490]: _type = "Task"
[ 1120.754530] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1120.763439] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707480, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1121.266872] env[59490]: DEBUG oslo_vmware.exceptions [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1121.267283] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1121.268014] env[59490]: ERROR nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1121.268014] env[59490]: Faults: ['InvalidArgument']
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Traceback (most recent call last):
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] yield resources
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] self.driver.spawn(context, instance, image_meta,
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] self._fetch_image_if_missing(context, vi)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] image_cache(vi, tmp_image_ds_loc)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] vm_util.copy_virtual_disk(
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] session._wait_for_task(vmdk_copy_task)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] return self.wait_for_task(task_ref)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] return evt.wait()
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] result = hub.switch()
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] return self.greenlet.switch()
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] self.f(*self.args, **self.kw)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] raise exceptions.translate_fault(task_info.error)
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Faults: ['InvalidArgument']
[ 1121.268014] env[59490]: ERROR nova.compute.manager [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802]
[ 1121.269201] env[59490]: INFO nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Terminating instance
[ 1121.271914] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1121.272241] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1121.273056] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
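[editor's note] The spawn failure above surfaces as oslo_vmware.exceptions.VimFaultException once _poll_task translates the task error; "Fault InvalidArgument not matched" just means no more specific exception class was registered for that fault name. A hedged sketch of how a caller can match on the raw fault names (the wrapper function and cleanup callback are hypothetical; the log above responds by terminating the half-built instance):

    from oslo_vmware import exceptions as vexc

    def run_task_with_cleanup(session, task, cleanup):
        # session is assumed to be an oslo_vmware.api.VMwareAPISession
        try:
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list holds the raw VIM fault names, e.g. ['InvalidArgument'],
            # matching the "Faults: ['InvalidArgument']" line in the traceback above
            if 'InvalidArgument' in e.fault_list:
                cleanup()
            raise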
[ 1121.273359] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1121.273679] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-251e458b-87dd-45e8-8607-736dce14c9e0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.276256] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b0f9854-91f9-4c57-882d-1552cb4afeee {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.285026] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1121.285026] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d30a84b9-f9e8-4fdc-bcb8-79c90c92643d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.286940] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1121.287255] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1121.288540] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4ab1f6cf-e625-4e7c-90c6-22333590c8e1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.294205] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){
[ 1121.294205] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52e45aed-6c24-a60e-9a62-413ed2587ecd"
[ 1121.294205] env[59490]: _type = "Task"
[ 1121.294205] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1121.302817] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52e45aed-6c24-a60e-9a62-413ed2587ecd, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1121.459719] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1121.460476] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1121.460476] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Deleting the datastore file [datastore2] 0ead3d36-7d65-4e6d-be85-a6736acd3802 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1121.460476] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-22ed015b-b91a-43e8-ad71-19b798ec43a3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.467729] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Waiting for the task: (returnval){
[ 1121.467729] env[59490]: value = "task-707482"
[ 1121.467729] env[59490]: _type = "Task"
[ 1121.467729] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1121.477861] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707482, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1121.805027] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1121.805286] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1121.805515] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b396fdb2-ca2e-47d6-9cec-84c6468f22ed {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.818203] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1121.818410] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Fetch image to [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1121.818571] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1121.819356] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfbe3fca-46f2-4354-b696-6f88466cf5ec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.826753] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e668269-cdde-44ef-9151-ef6a4b3616a5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.836488] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8710dbd5-fbd1-4ec7-aa5c-8157186973c4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1121.867204] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8b7f82c-20ee-4ea6-98bb-1e5a52211487 {{(pid=59490) request_handler
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1121.873706] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cf5fc22f-c26c-4e25-bad0-a6d3f852e5d4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1121.896987] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1121.944041] env[59490]: DEBUG oslo_vmware.rw_handles [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1122.001238] env[59490]: DEBUG oslo_vmware.rw_handles [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1122.001398] env[59490]: DEBUG oslo_vmware.rw_handles [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1122.006217] env[59490]: DEBUG oslo_vmware.api [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Task: {'id': task-707482, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075797} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1122.006507] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1122.006746] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1122.006960] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1122.007179] env[59490]: INFO nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Took 0.73 seconds to destroy the instance on the hypervisor. [ 1122.009597] env[59490]: DEBUG nova.compute.claims [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1122.009737] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1122.009971] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1122.037865] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.038587] env[59490]: DEBUG nova.compute.utils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance 0ead3d36-7d65-4e6d-be85-a6736acd3802 could not be found. 
{{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1122.040075] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1122.040243] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1122.040400] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1122.040564] env[59490]: DEBUG nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1122.040720] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1122.068259] env[59490]: DEBUG nova.network.neutron [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1122.076858] env[59490]: INFO nova.compute.manager [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Took 0.04 seconds to deallocate network for instance. [ 1122.117541] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e727cbab-9623-4ec9-a398-b28718f4abce tempest-ServersTestJSON-1646516409 tempest-ServersTestJSON-1646516409-project-member] Lock "0ead3d36-7d65-4e6d-be85-a6736acd3802" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 346.729s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.128693] env[59490]: DEBUG nova.compute.manager [None req-80487f3d-8756-42c9-9e2b-083312244fe0 tempest-ServerTagsTestJSON-1826652627 tempest-ServerTagsTestJSON-1826652627-project-member] [instance: ecb7312c-80f0-490e-8357-7138680d0f90] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1122.150218] env[59490]: DEBUG nova.compute.manager [None req-80487f3d-8756-42c9-9e2b-083312244fe0 tempest-ServerTagsTestJSON-1826652627 tempest-ServerTagsTestJSON-1826652627-project-member] [instance: ecb7312c-80f0-490e-8357-7138680d0f90] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1122.171927] env[59490]: DEBUG oslo_concurrency.lockutils [None req-80487f3d-8756-42c9-9e2b-083312244fe0 tempest-ServerTagsTestJSON-1826652627 tempest-ServerTagsTestJSON-1826652627-project-member] Lock "ecb7312c-80f0-490e-8357-7138680d0f90" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.006s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.180166] env[59490]: DEBUG nova.compute.manager [None req-e17cc6b6-765a-479a-96ec-ba00b4be8ca5 tempest-ServerAddressesNegativeTestJSON-326820771 tempest-ServerAddressesNegativeTestJSON-326820771-project-member] [instance: e24d5bbc-6168-4523-9a0c-cd29c14c9e56] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1122.206183] env[59490]: DEBUG nova.compute.manager [None req-e17cc6b6-765a-479a-96ec-ba00b4be8ca5 tempest-ServerAddressesNegativeTestJSON-326820771 tempest-ServerAddressesNegativeTestJSON-326820771-project-member] [instance: e24d5bbc-6168-4523-9a0c-cd29c14c9e56] Instance disappeared before build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1122.226250] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e17cc6b6-765a-479a-96ec-ba00b4be8ca5 tempest-ServerAddressesNegativeTestJSON-326820771 tempest-ServerAddressesNegativeTestJSON-326820771-project-member] Lock "e24d5bbc-6168-4523-9a0c-cd29c14c9e56" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.681s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1122.233934] env[59490]: DEBUG nova.compute.manager [None req-1d466185-1143-432b-a936-8d3a1690079b tempest-ServerMetadataNegativeTestJSON-221694196 tempest-ServerMetadataNegativeTestJSON-221694196-project-member] [instance: 643bfd74-592a-452c-af62-ded4c23009f9] Starting instance... {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1122.254682] env[59490]: DEBUG nova.compute.manager [None req-1d466185-1143-432b-a936-8d3a1690079b tempest-ServerMetadataNegativeTestJSON-221694196 tempest-ServerMetadataNegativeTestJSON-221694196-project-member] [instance: 643bfd74-592a-452c-af62-ded4c23009f9] Instance disappeared before build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2409}} [ 1122.273590] env[59490]: DEBUG oslo_concurrency.lockutils [None req-1d466185-1143-432b-a936-8d3a1690079b tempest-ServerMetadataNegativeTestJSON-221694196 tempest-ServerMetadataNegativeTestJSON-221694196-project-member] Lock "643bfd74-592a-452c-af62-ded4c23009f9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.140s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1128.386304] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1128.386752] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances with incomplete migration {{(pid=59490) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11137}} [ 1129.393279] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1129.393633] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1130.385029] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1130.385029] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1130.396051] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1130.396345] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1130.396441] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1130.396574] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource 
/opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1130.397663] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f76f050b-ccc0-403c-9295-2955ab670d2c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.409654] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b952d051-05bc-4f36-a3fc-3e13e6ccc0da {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.427854] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-903c6c81-5fa6-4887-a5c4-5f1dd3d020e7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.436136] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13842010-7fd3-45a2-b8b8-a73ea9aad56c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.468740] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181650MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1130.468952] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1130.469101] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1130.508699] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1130.509117] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1130.509292] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1130.538776] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43c21e56-4755-49f3-b1f4-f775393bd3f0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.547017] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec9bc113-c7b7-47f0-a543-ad38e0ef5979 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.585496] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33c15547-2616-447e-831b-11523e863120 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.594174] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6bed1db-cc17-4dfb-8f6e-3fdf50ddc488 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1130.608406] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1130.617138] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1130.630430] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1130.630639] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.161s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1131.625356] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] 
Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1132.384075] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1132.384315] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1133.384761] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1133.385150] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1133.385150] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 1133.395142] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1133.395286] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 1134.384500] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1134.384733] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11099}} [ 1134.428013] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] There are 19 instances to clean {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}} [ 1134.428357] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.464769] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.501487] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.539577] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.559335] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.577934] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.600172] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.617862] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.635330] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ead3d36-7d65-4e6d-be85-a6736acd3802] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.654221] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 504e16b8-70d2-437f-ab3e-7631cb74abec] 
Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.671944] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 3464c5af-60a4-4b6d-b7ca-51cf7312cf09] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.689433] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 1c7b3da9-32ab-4aa0-90e3-f27bf5996590] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.707476] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 581848be-38fb-42da-b723-480bf297d1a5] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.727081] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 2f083456-3eb9-4022-86a3-8d39f83c470f] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.745795] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 31207de9-e903-4ed4-bccc-c0796edec34b] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.763496] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 67c0aeae-1212-4dc6-8f1b-2fc494fbd1a7] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.781367] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: ad8223ea-b097-439f-bcff-9c06bd1cf5e6] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.801115] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 0ec55812-86b7-44ef-822a-88a2ff1816c3] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1134.818834] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: 71698ce4-94a0-442c-8081-374616ce2ac4] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}} [ 1135.836406] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1136.381065] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1136.391570] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59490) run_periodic_tasks 
/usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1138.393898] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1169.539964] env[59490]: WARNING oslo_vmware.rw_handles [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1169.539964] env[59490]: ERROR oslo_vmware.rw_handles [ 1169.540791] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1169.542014] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1169.542252] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Copying Virtual Disk [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/26057f68-efc0-45c6-930f-9f4655a8f78f/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1169.542538] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-4df6af99-0926-4eeb-ae7b-f457daf447c9 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1169.551219] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 1169.551219] env[59490]: value = "task-707483" [ 1169.551219] env[59490]: _type = "Task" [ 1169.551219] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1169.559505] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707483, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1170.061279] env[59490]: DEBUG oslo_vmware.exceptions [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1170.061537] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1170.062107] env[59490]: ERROR nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1170.062107] env[59490]: Faults: ['InvalidArgument'] [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Traceback (most recent call last): [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] yield resources [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self.driver.spawn(context, instance, image_meta, [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self._fetch_image_if_missing(context, vi) [ 1170.062107] env[59490]: ERROR 
nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] image_cache(vi, tmp_image_ds_loc) [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] vm_util.copy_virtual_disk( [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] session._wait_for_task(vmdk_copy_task) [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return self.wait_for_task(task_ref) [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return evt.wait() [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] result = hub.switch() [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return self.greenlet.switch() [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self.f(*self.args, **self.kw) [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] raise exceptions.translate_fault(task_info.error) [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Faults: ['InvalidArgument'] [ 1170.062107] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] [ 1170.063112] env[59490]: INFO nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 
2907e146-ad50-47f3-9390-7ae3ae99ce97] Terminating instance [ 1170.064103] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1170.064428] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1170.064756] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2e6985b1-aa3a-4650-a359-f8cd06b06ada {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.066980] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1170.067190] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1170.067884] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ba4c47-d794-4daa-94d8-de5e14689a51 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.074649] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1170.074877] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4f22fe9a-1564-4d5a-9187-02d6d6b0f011 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.076942] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1170.077115] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1170.078029] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7f52a14-d7ae-45ed-a746-39e944a2288e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.082650] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Waiting for the task: (returnval){ [ 1170.082650] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]522de1be-f94e-a955-2d9c-6bedf835408c" [ 1170.082650] env[59490]: _type = "Task" [ 1170.082650] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1170.089578] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]522de1be-f94e-a955-2d9c-6bedf835408c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1170.138511] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1170.138709] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1170.138899] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleting the datastore file [datastore2] 2907e146-ad50-47f3-9390-7ae3ae99ce97 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1170.139183] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f188da73-c4dc-4a6f-9ddc-540a7d5ea6df {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.144440] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 1170.144440] env[59490]: value = "task-707485" [ 1170.144440] env[59490]: _type = "Task" [ 1170.144440] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1170.151775] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707485, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1170.594473] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1170.594809] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Creating directory with path [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1170.594922] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d1daecd8-c63c-4f78-bed1-9f5005074cab {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.605878] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Created directory with path [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1170.606069] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Fetch image to [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1170.606236] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1170.606972] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-15bf1c1c-ee55-4988-b841-18909eb57804 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.613270] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef6fe145-4efa-4b7f-9ebd-2e215c541d71 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.622357] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b4b370b-2b64-42be-b70a-27867033e425 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.654164] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb8a7c9c-2a35-435d-bebc-f048e39fe19c {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.661852] env[59490]: DEBUG oslo_vmware.api [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707485, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.074986} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1170.663161] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1170.663337] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1170.663498] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1170.663664] env[59490]: INFO nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Took 0.60 seconds to destroy the instance on the hypervisor. 
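[Editorial note on the recurring task-polling entries: the paired "Waiting for the task: (returnval){...} to complete" and "Task: {'id': ..., 'name': ...} progress is N%" lines throughout this log come from oslo.vmware's wait_for_task loop, which polls the vCenter task's state and either reports progress, returns on success, or raises the translated fault (the source of the VimFaultException tracebacks above). A minimal self-contained sketch of that pattern, not oslo.vmware itself; poll_task_info is a hypothetical callable standing in for the real PropertyCollector query of TaskInfo:

    import time

    def wait_for_task(poll_task_info, interval=0.5):
        # poll_task_info() -> (state, progress, result, error); a hypothetical
        # stand-in for reading TaskInfo from vCenter via the PropertyCollector.
        while True:
            state, progress, result, error = poll_task_info()
            if state in ("queued", "running"):
                # Mirrors the "... progress is N%" DEBUG entries in this log.
                print("progress is %s%%" % progress)
                time.sleep(interval)
            elif state == "success":
                return result
            else:
                # On an "error" state, oslo.vmware translates the task fault
                # and raises it; that is where VimFaultException originates.
                raise RuntimeError(error)
]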
[ 1170.665593] env[59490]: DEBUG nova.compute.claims [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1170.665777] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1170.665975] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1170.668429] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-298f074a-d986-41bd-bf10-ceee8518d88a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.690874] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1170.733518] env[59490]: DEBUG oslo_vmware.rw_handles [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1170.788414] env[59490]: DEBUG oslo_vmware.rw_handles [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1170.788588] env[59490]: DEBUG oslo_vmware.rw_handles [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1170.796853] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2a6307c3-aa28-4cbc-a486-97e767c0e748 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.804058] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77e7723a-7b24-4b28-aacf-cc0d7aeb8c30 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.832734] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a6a6cad-525a-4419-aaf9-b0ec907bf918 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.839541] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01326107-1e35-4d85-8258-2431cc1f34e1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1170.851932] env[59490]: DEBUG nova.compute.provider_tree [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1170.860208] env[59490]: DEBUG nova.scheduler.client.report [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1170.872365] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.206s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1170.872906] env[59490]: ERROR nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1170.872906] env[59490]: Faults: ['InvalidArgument'] [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Traceback (most recent call last): [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] 
self.driver.spawn(context, instance, image_meta, [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self._fetch_image_if_missing(context, vi) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] image_cache(vi, tmp_image_ds_loc) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] vm_util.copy_virtual_disk( [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] session._wait_for_task(vmdk_copy_task) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return self.wait_for_task(task_ref) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return evt.wait() [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] result = hub.switch() [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] return self.greenlet.switch() [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] self.f(*self.args, **self.kw) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1170.872906] env[59490]: ERROR 
nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] raise exceptions.translate_fault(task_info.error) [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Faults: ['InvalidArgument'] [ 1170.872906] env[59490]: ERROR nova.compute.manager [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] [ 1170.873938] env[59490]: DEBUG nova.compute.utils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1170.874938] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Build of instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 was re-scheduled: A specified parameter was not correct: fileType [ 1170.874938] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1170.875393] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1170.875557] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1170.875716] env[59490]: DEBUG nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1170.875869] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1171.136197] env[59490]: DEBUG nova.network.neutron [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1171.153138] env[59490]: INFO nova.compute.manager [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Took 0.28 seconds to deallocate network for instance. [ 1171.237617] env[59490]: INFO nova.scheduler.client.report [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleted allocations for instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 [ 1171.253271] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d3ba0d5e-bb11-493e-9d10-4ab3641a0f0b tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 347.129s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1171.253528] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 151.370s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1171.253791] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1171.253993] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=59490) inner 
/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1171.254168] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1171.255982] env[59490]: INFO nova.compute.manager [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Terminating instance [ 1171.257651] env[59490]: DEBUG nova.compute.manager [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1171.257843] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1171.258323] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-3e9d1136-e2e8-4021-9850-4c91d852805e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.267630] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-672417e9-be74-4f63-b20d-39f7c903a62f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1171.295276] env[59490]: WARNING nova.virt.vmwareapi.vmops [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 2907e146-ad50-47f3-9390-7ae3ae99ce97 could not be found. [ 1171.295479] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1171.295653] env[59490]: INFO nova.compute.manager [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1171.295883] env[59490]: DEBUG oslo.service.loopingcall [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1171.296129] env[59490]: DEBUG nova.compute.manager [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1171.296255] env[59490]: DEBUG nova.network.neutron [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1171.320526] env[59490]: DEBUG nova.network.neutron [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1171.327977] env[59490]: INFO nova.compute.manager [-] [instance: 2907e146-ad50-47f3-9390-7ae3ae99ce97] Took 0.03 seconds to deallocate network for instance. [ 1171.465461] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6d2eba9e-bc62-44f9-9369-88def7fc8ccb tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "2907e146-ad50-47f3-9390-7ae3ae99ce97" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.212s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1171.890278] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "bc0157a8-969b-448c-82cf-c773e07d6d02" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1171.890602] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1171.900051] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1171.949218] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1171.949467] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1171.950921] env[59490]: INFO nova.compute.claims [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1172.015201] env[59490]: DEBUG nova.scheduler.client.report [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Refreshing inventories for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1172.029500] env[59490]: DEBUG nova.scheduler.client.report [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Updating ProviderTree inventory for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1172.029717] env[59490]: DEBUG nova.compute.provider_tree [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Updating inventory in ProviderTree for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1172.041067] env[59490]: DEBUG nova.scheduler.client.report [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Refreshing aggregate associations for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20, aggregates: None {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1172.058334] env[59490]: DEBUG 
nova.scheduler.client.report [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Refreshing trait associations for resource provider 715aacdb-6e76-47b7-ae6f-492abc122a20, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NODE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NET_ATTACH_INTERFACE {{(pid=59490) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1172.088185] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77b15936-3c00-4b3e-9b94-27582f660860 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.099254] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b287dc03-560e-46f6-a13d-d50b89f1150c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.148389] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ee1bbc0-1b49-4701-bfe3-05cf756f9caf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.159435] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ff81d28-496d-4d77-9e6a-44808ab7a4c5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.181641] env[59490]: DEBUG nova.compute.provider_tree [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1172.192783] env[59490]: DEBUG nova.scheduler.client.report [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1172.209540] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.260s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1172.210083] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Start building networks asynchronously for instance. 
{{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1172.248268] env[59490]: DEBUG nova.compute.utils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1172.249507] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1172.249680] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1172.258858] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1172.308131] env[59490]: DEBUG nova.policy [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93256992d7a84e72882b4c132c337393', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c2133066748948909baea488349a4b78', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1172.324214] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1172.345075] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=<?>,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2024-09-07T10:18:33Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1172.345322] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1172.345468] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1172.345640] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1172.345779] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1172.345918] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1172.346150] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1172.346306] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1172.346461] env[59490]: DEBUG nova.virt.hardware [None 
req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1172.346645] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1172.346817] env[59490]: DEBUG nova.virt.hardware [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1172.347701] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-332e42ff-cfe0-4a59-b715-1cb7e0eba901 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.355767] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2658e2b-6289-4765-bfe4-f805fad547ab {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1172.648946] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Successfully created port: d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1173.169363] env[59490]: DEBUG nova.compute.manager [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Received event network-vif-plugged-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1173.169735] env[59490]: DEBUG oslo_concurrency.lockutils [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] Acquiring lock "bc0157a8-969b-448c-82cf-c773e07d6d02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1173.169800] env[59490]: DEBUG oslo_concurrency.lockutils [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1173.169919] env[59490]: DEBUG oslo_concurrency.lockutils [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1173.170203] env[59490]: DEBUG nova.compute.manager [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 
req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] No waiting events found dispatching network-vif-plugged-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1173.170442] env[59490]: WARNING nova.compute.manager [req-c632720c-02a4-4eb5-9f49-875d12ec16d8 req-a2ea7a8d-8df5-469d-8ffd-a0db438b021d service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Received unexpected event network-vif-plugged-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 for instance with vm_state building and task_state spawning. [ 1173.218406] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Successfully updated port: d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1173.231152] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1173.231308] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1173.231444] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1173.263946] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance cache missing network info. 
{{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1173.415830] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Updating instance_info_cache with network_info: [{"id": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "address": "fa:16:3e:02:e8:35", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1c4d20b-00", "ovs_interfaceid": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1173.426679] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1173.426908] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance network_info: |[{"id": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "address": "fa:16:3e:02:e8:35", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1c4d20b-00", "ovs_interfaceid": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1967}} [ 1173.427271] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:02:e8:35', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '3ac3fd84-c373-49f5-82dc-784a6cdb686d', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd1c4d20b-00d5-4c04-aaf1-0d4a8d304b99', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1173.434722] env[59490]: DEBUG oslo.service.loopingcall [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1173.435297] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1173.435519] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d7a9a2d1-d800-4928-892f-22dc18944e8b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1173.455421] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1173.455421] env[59490]: value = "task-707486" [ 1173.455421] env[59490]: _type = "Task" [ 1173.455421] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1173.466581] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707486, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1173.966060] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707486, 'name': CreateVM_Task, 'duration_secs': 0.284709} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1173.966233] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1173.966939] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1173.967096] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1173.967423] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1173.967649] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a597996e-0f4e-4ffe-8a8b-4f63dde159dd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1173.971756] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 1173.971756] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52e4a3d3-8793-ff6e-5664-62f6e541f13b" [ 1173.971756] env[59490]: _type = "Task" [ 1173.971756] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1173.979327] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52e4a3d3-8793-ff6e-5664-62f6e541f13b, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1174.482547] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1174.482910] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1174.482992] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1175.199585] env[59490]: DEBUG nova.compute.manager [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Received event network-changed-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1175.199770] env[59490]: DEBUG nova.compute.manager [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Refreshing instance network info cache due to event network-changed-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99. {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1175.200035] env[59490]: DEBUG oslo_concurrency.lockutils [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] Acquiring lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1175.200195] env[59490]: DEBUG oslo_concurrency.lockutils [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] Acquired lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1175.200354] env[59490]: DEBUG nova.network.neutron [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Refreshing network info cache for port d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1175.605851] env[59490]: DEBUG nova.network.neutron [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Updated VIF entry in instance network info cache for port d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1175.606212] env[59490]: DEBUG nova.network.neutron [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Updating instance_info_cache with network_info: [{"id": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "address": "fa:16:3e:02:e8:35", "network": {"id": "b450e60c-46b8-4062-b33f-d571e301c94b", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-2054261491-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "c2133066748948909baea488349a4b78", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "3ac3fd84-c373-49f5-82dc-784a6cdb686d", "external-id": "nsx-vlan-transportzone-298", "segmentation_id": 298, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd1c4d20b-00", "ovs_interfaceid": "d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1175.615199] env[59490]: DEBUG oslo_concurrency.lockutils [req-6d047731-ef3d-4a6b-89f2-31cef4027c68 req-bcbc5833-78fc-48a9-b917-677f5c8516d2 service nova] Releasing lock "refresh_cache-bc0157a8-969b-448c-82cf-c773e07d6d02" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1180.584602] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_power_states {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1180.594930] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Getting list of instances from cluster (obj){ [ 1180.594930] env[59490]: value = "domain-c8" [ 1180.594930] env[59490]: _type = "ClusterComputeResource" [ 1180.594930] env[59490]: } {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1180.595959] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab70916b-cd18-47d4-a6f2-e24182087a43 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1180.611523] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Got total of 9 instances {{(pid=59490) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1180.611668] env[59490]: WARNING nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] While synchronizing instance power states, found 1 instances in the database and 9 instances on the hypervisor. 
[ 1180.611801] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Triggering sync for uuid bc0157a8-969b-448c-82cf-c773e07d6d02 {{(pid=59490) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10222}} [ 1180.612133] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "bc0157a8-969b-448c-82cf-c773e07d6d02" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1190.385592] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1190.385961] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1191.380085] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.384810] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.385262] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1192.395257] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.395457] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.395598] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1192.395744] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1192.396805] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b099dc-458d-4b1f-bd26-1417c231d575 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.405276] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acd3d7a9-4237-4f30-910f-d16e54351efa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.419337] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33a86a49-0006-457f-ae40-5ffcccddf4ed {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.425238] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-522366ce-ec9c-4591-bd0e-50e8084e0a0f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.453055] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181629MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1192.453227] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1192.453380] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1192.488725] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance bc0157a8-969b-448c-82cf-c773e07d6d02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1192.488916] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1192.489067] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1192.514352] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ffcaccb-940d-4b35-8c7c-1a8097af6f1b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.521560] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a856c29-cabc-4eb3-9880-2150aa536f27 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.551294] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4fc2581-a1c5-448c-894a-19c44772938d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.559180] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e59f42c-80c3-48b3-bd8d-93db91fbf051 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1192.577108] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1192.585435] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1192.598841] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1192.599079] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.146s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1193.598982] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] 
Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1194.384077] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1194.384331] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1194.384331] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 1194.394264] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1194.394457] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 1194.394579] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1195.384773] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1200.384779] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1220.772206] env[59490]: WARNING oslo_vmware.rw_handles [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status 
[ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1220.772206] env[59490]: ERROR oslo_vmware.rw_handles [ 1220.772875] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1220.774236] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1220.774477] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Copying Virtual Disk [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/3fff40cc-d756-4fe9-bace-0893945375dd/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1220.774748] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-76024c20-ceea-4849-ad26-ab642e5e4d47 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1220.784014] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Waiting for the task: (returnval){ [ 1220.784014] env[59490]: value = "task-707487" [ 1220.784014] env[59490]: _type = "Task" [ 1220.784014] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1220.791362] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Task: {'id': task-707487, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1221.294380] env[59490]: DEBUG oslo_vmware.exceptions [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Fault InvalidArgument not matched. 
{{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1221.294661] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1221.295260] env[59490]: ERROR nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1221.295260] env[59490]: Faults: ['InvalidArgument'] [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Traceback (most recent call last): [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] yield resources [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] self.driver.spawn(context, instance, image_meta, [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] self._fetch_image_if_missing(context, vi) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] image_cache(vi, tmp_image_ds_loc) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] vm_util.copy_virtual_disk( [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] session._wait_for_task(vmdk_copy_task) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task 
[ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] return self.wait_for_task(task_ref) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] return evt.wait() [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] result = hub.switch() [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] return self.greenlet.switch() [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] self.f(*self.args, **self.kw) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] raise exceptions.translate_fault(task_info.error) [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Faults: ['InvalidArgument'] [ 1221.295260] env[59490]: ERROR nova.compute.manager [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] [ 1221.296511] env[59490]: INFO nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Terminating instance [ 1221.297370] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1221.297567] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1221.297787] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f651d69a-f525-4c77-89a5-223a0c4fb2ba {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.299919] env[59490]: DEBUG nova.compute.manager 
[None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1221.300127] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1221.300812] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-062f6ad9-7279-438a-bbda-f00399bab7b7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.307669] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1221.307856] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-37b542b5-ad42-49ea-a2a2-51b217082306 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.309911] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1221.310087] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1221.311014] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f75715e-1c3a-4657-a0b0-cf3fd15d8b73 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.315600] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1221.315600] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52ae56c2-1ecf-ae2c-2cae-1b277a459cae" [ 1221.315600] env[59490]: _type = "Task" [ 1221.315600] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1221.322563] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52ae56c2-1ecf-ae2c-2cae-1b277a459cae, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1221.415670] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1221.415872] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1221.416053] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Deleting the datastore file [datastore2] f63ed63f-b989-40b4-b7d5-3c5a6841ee08 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1221.416323] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c3cb0cc9-4c45-459c-8eef-ca39191ea98f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.424547] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Waiting for the task: (returnval){ [ 1221.424547] env[59490]: value = "task-707489" [ 1221.424547] env[59490]: _type = "Task" [ 1221.424547] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1221.433430] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Task: {'id': task-707489, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1221.826079] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1221.826453] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1221.826621] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11e28848-b0e9-4cb6-9ea4-7143e0ec4535 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.837478] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1221.837670] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Fetch image to [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1221.837832] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1221.838544] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db5f228c-6848-42f9-9192-a419611b2466 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.844585] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a43546-6af0-4963-a997-e75a35f19794 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.853067] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a284bae8-fd4e-4b88-b9cc-fd24b4c8b4cb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.882237] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf322fe-adfc-4fbd-b5ff-bf9236e00cf9 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.887215] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-7bcc9f0c-0b32-4269-a905-e970f1f20eb6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1221.908014] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1221.935513] env[59490]: DEBUG oslo_vmware.api [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Task: {'id': task-707489, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07505} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1221.935754] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1221.935918] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1221.936096] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1221.936262] env[59490]: INFO nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Took 0.64 seconds to destroy the instance on the hypervisor. 
[ 1221.938294] env[59490]: DEBUG nova.compute.claims [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1221.938451] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1221.938649] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1221.955685] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1222.005409] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.067s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1222.006160] env[59490]: DEBUG nova.compute.utils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1222.007659] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1222.007853] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1222.008060] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1222.008224] env[59490]: DEBUG nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1222.008379] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1222.012009] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}}
[ 1222.012173] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}}
[ 1222.034094] env[59490]: DEBUG nova.network.neutron [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1222.042123] env[59490]: INFO nova.compute.manager [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Took 0.03 seconds to deallocate network for instance.
[ 1222.082570] env[59490]: DEBUG oslo_concurrency.lockutils [None req-9934ef14-f49a-4107-9ac4-6fa80ef81007 tempest-ServersTestMultiNic-180611203 tempest-ServersTestMultiNic-180611203-project-member] Lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 327.239s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1222.082783] env[59490]: DEBUG oslo_concurrency.lockutils [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] Acquired lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1222.083634] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1904d67c-f3ea-4ffd-9c14-80f26c2e8eb7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1222.091413] env[59490]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error.
[ 1222.091563] env[59490]: DEBUG oslo_vmware.api [-] Fault list: [ManagedObjectNotFound] {{(pid=59490) _invoke_api /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:337}}
[ 1222.091867] env[59490]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-834988f4-e8e4-4a04-9922-e4a2b4ae5d3a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1222.098972] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70fd2f6b-e939-4269-a1a5-3823743e896d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1222.126243] env[59490]: ERROR root [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] Original exception being dropped: ['Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 377, in request_handler\n response = request(managed_object, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 586, in __call__\n return client.invoke(args, kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 728, in invoke\n result = self.send(soapenv, timeout=timeout)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 777, in send\n return self.process_reply(reply.message, None, None)\n', ' File "/usr/local/lib/python3.10/dist-packages/suds/client.py", line 840, in process_reply\n raise WebFault(fault, replyroot)\n', "suds.WebFault: Server raised fault: 'The object 'vim.VirtualMachine:vm-168963' has already been deleted or has not been completely created'\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 301, in _invoke_api\n return api_method(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 480, in get_object_property\n props = get_object_properties(vim, moref, [property_name],\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/vim_util.py", line 360, in get_object_properties\n retrieve_result = vim.RetrievePropertiesEx(\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py", line 413, in request_handler\n raise exceptions.VimFaultException(fault_list, fault_string,\n', "oslo_vmware.exceptions.VimFaultException: The object 'vim.VirtualMachine:vm-168963' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-168963' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-168963'}\n", '\nDuring handling of the above exception, another exception occurred:\n\n', 'Traceback (most recent call last):\n', ' File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 123, in _call_method\n return self.invoke_api(module, method, self.vim, *args,\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 358, in invoke_api\n return _invoke_api(module, method, *args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 122, in func\n return evt.wait()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait\n result = hub.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch\n return self.greenlet.switch()\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 122, in _inner\n idle = self.f(*self.args, **self.kw)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 96, in _func\n result = f(*args, **kwargs)\n', ' File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 341, in _invoke_api\n raise clazz(str(excep),\n', "oslo_vmware.exceptions.ManagedObjectNotFoundException: The object 'vim.VirtualMachine:vm-168963' has already been deleted or has not been completely created\nCause: Server raised fault: 'The object 'vim.VirtualMachine:vm-168963' has already been deleted or has not been completely created'\nFaults: [ManagedObjectNotFound]\nDetails: {'obj': 'vm-168963'}\n"]: nova.exception.InstanceNotFound: Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 could not be found.
[ 1222.126444] env[59490]: DEBUG oslo_concurrency.lockutils [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] Releasing lock "f63ed63f-b989-40b4-b7d5-3c5a6841ee08" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1222.126642] env[59490]: DEBUG nova.compute.manager [req-b14f12a7-e2ec-4ac1-93e2-7ec597c401a7 req-4799dada-07fe-480b-bf62-aa8b5b85eefc service nova] [instance: f63ed63f-b989-40b4-b7d5-3c5a6841ee08] Detach interface failed, port_id=622c9619-1870-4434-aee0-8d5ab7122977, reason: Instance f63ed63f-b989-40b4-b7d5-3c5a6841ee08 could not be found. {{(pid=59490) _process_instance_vif_deleted_event /opt/stack/nova/nova/compute/manager.py:10836}}
[ 1250.384329] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1250.384699] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}}
[ 1251.380077] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1252.384846] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1253.384142] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1253.394845] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1253.395171] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1253.395225] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1253.395341] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1253.396789] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24ff0388-a829-408f-8a8b-e9e3be66f38e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.404927] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1f4eb26-daac-45f0-911c-cdb2ad110e85 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.418311] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-809c3fa7-7e1c-44c3-8269-7d21731f8258 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.424106] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f873315-719f-4e49-ad45-43697f0ea7e6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.452067] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181669MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1253.452212] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1253.452396] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1253.488766] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance bc0157a8-969b-448c-82cf-c773e07d6d02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}}
[ 1253.488964] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1253.489124] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1253.518749] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c582751-fe3e-4211-a1f5-eb8df085fe77 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.526068] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a5d6be6-fd2d-4ea6-bd84-394ef74dda53 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.554998] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e17b6866-863c-4c46-b25e-ae5f7cb100e0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.561771] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6ae1dec-0de4-4bca-b005-781b1e019dce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1253.574206] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1253.582347] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1253.595961] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1253.596122] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1254.595735] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1254.596129] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1254.596129] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}}
[ 1254.605541] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}}
[ 1254.605726] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}}
[ 1255.383488] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1256.384734] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1257.385179] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1261.380016] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1262.383953] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1271.607115] env[59490]: WARNING oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles response.begin()
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1271.607115] env[59490]: ERROR oslo_vmware.rw_handles
[ 1271.607766] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1271.609393] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1271.609641] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Copying Virtual Disk [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/c0997798-eaf7-40e8-a7df-0c7046ae6bcf/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1271.609937] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-dc324600-73da-4635-9a81-0d04008bdaf7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1271.620342] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){
[ 1271.620342] env[59490]: value = "task-707490"
[ 1271.620342] env[59490]: _type = "Task"
[ 1271.620342] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1271.627915] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707490, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1272.130847] env[59490]: DEBUG oslo_vmware.exceptions [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}}
[ 1272.131091] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}}
[ 1272.131649] env[59490]: ERROR nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1272.131649] env[59490]: Faults: ['InvalidArgument']
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Traceback (most recent call last):
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] yield resources
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] self.driver.spawn(context, instance, image_meta,
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] self._fetch_image_if_missing(context, vi)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] image_cache(vi, tmp_image_ds_loc)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] vm_util.copy_virtual_disk(
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] session._wait_for_task(vmdk_copy_task)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] return self.wait_for_task(task_ref)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] return evt.wait()
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] result = hub.switch()
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] return self.greenlet.switch()
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] self.f(*self.args, **self.kw)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] raise exceptions.translate_fault(task_info.error)
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Faults: ['InvalidArgument']
[ 1272.131649] env[59490]: ERROR nova.compute.manager [instance: ddbac2db-c555-4554-aa21-7303c8e36371]
[ 1272.132719] env[59490]: INFO nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Terminating instance
[ 1272.133488] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1272.133684] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1272.133903] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-aadf68cb-75da-40ec-a4ea-829b2387f56d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1272.135979] env[59490]: DEBUG
nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1272.136177] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1272.136904] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e385eac5-9a4c-4e75-8bd8-97f3f131347e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.143511] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1272.143706] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-3462abcb-52df-49bd-95cd-e417d42eea61 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.145707] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1272.145867] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1272.146793] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6b0dbde5-dfb9-4c9f-9814-c4c1e353cd02 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.152330] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1272.152330] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]521d4964-a651-7b78-1fb7-da36747cdfda" [ 1272.152330] env[59490]: _type = "Task" [ 1272.152330] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1272.160590] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]521d4964-a651-7b78-1fb7-da36747cdfda, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1272.210266] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1272.210464] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1272.210615] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleting the datastore file [datastore2] ddbac2db-c555-4554-aa21-7303c8e36371 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1272.210854] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8668f640-5046-4440-bb3a-4111cde35ff8 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.216359] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1272.216359] env[59490]: value = "task-707492" [ 1272.216359] env[59490]: _type = "Task" [ 1272.216359] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1272.223883] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707492, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1272.662678] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1272.663165] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating directory with path [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1272.663211] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a62f9399-ea02-4997-afbd-8727a6628d5c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.673958] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Created directory with path [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1272.674152] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Fetch image to [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1272.674313] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1272.675018] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c58c934c-8c40-4369-b43f-f5e6d1d629e9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.681095] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6e07120-6a83-42fc-a13b-6b4cc2cdc8aa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.689737] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db807ced-3b46-4a3a-b76b-5125981d6f26 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.721596] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f675873-1f44-4645-935c-48534e3cb957 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.729042] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707492, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069971} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1272.730374] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1272.730555] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1272.730736] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1272.730961] env[59490]: INFO nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Took 0.59 seconds to destroy the instance on the hypervisor. 
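The sequence above is the core failure loop of this run: nova downloads the sparse image over an HTTP write handle (the RemoteDisconnected warning is raised in rw_handles close() while reading the server's final response, and is logged rather than fatal), then asks the VirtualDiskManager to copy tmp-sparse.vmdk into the image cache, and CopyVirtualDisk_Task comes back with the InvalidArgument fault on fileType, which _poll_task translates into the VimFaultException that aborts the spawn. A minimal sketch of that poll-and-translate path using the public oslo.vmware session API (placeholder credentials and datastore paths; the sourceDatacenter/destDatacenter arguments that nova's vm_util passes are omitted here for brevity; this is not nova's literal code):

    # Sketch only: reproduce the CopyVirtualDisk_Task wait seen in the log above.
    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vmware_exc

    session = vmware_api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',        # placeholder credentials
        api_retry_count=10, task_poll_interval=0.5)  # polling cadence

    disk_mgr = session.vim.service_content.virtualDiskManager
    copy_task = session.invoke_api(
        session.vim, 'CopyVirtualDisk_Task', disk_mgr,
        sourceName='[datastore2] vmware_temp/<tmp>/tmp-sparse.vmdk',   # placeholder paths
        destName='[datastore2] vmware_temp/<tmp>/<image-id>.vmdk')

    try:
        # wait_for_task() polls the task's info object and raises once
        # info.state goes to 'error'; fault translation is what turns the
        # vCenter fault into the VimFaultException logged above.
        session.wait_for_task(copy_task)
    except vmware_exc.VimFaultException as e:
        print('copy failed, faults:', e.fault_list)  # e.g. ['InvalidArgument']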
[ 1272.732904] env[59490]: DEBUG nova.compute.claims [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1272.733068] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1272.733266] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1272.735638] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-5a8937c2-cc2d-4c92-9b3b-d9534c1cee5b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1272.756236] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1272.758960] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1272.759767] env[59490]: DEBUG nova.compute.utils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance ddbac2db-c555-4554-aa21-7303c8e36371 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1272.760865] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Instance disappeared during build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1272.761036] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1272.761191] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1272.761347] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1272.761497] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1272.785622] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1272.793461] env[59490]: INFO nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: ddbac2db-c555-4554-aa21-7303c8e36371] Took 0.03 seconds to deallocate network for instance. [ 1272.801169] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1272.855426] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Completed reading data from the image iterator. 
{{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1272.855590] env[59490]: DEBUG oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1272.874662] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "ddbac2db-c555-4554-aa21-7303c8e36371" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 377.427s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1312.379620] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1312.383269] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1312.383418] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1313.384462] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1313.384835] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1313.396134] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1313.396343] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1313.396499] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1313.396676] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1313.397772] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00301173-fd02-41f4-b24e-80400db27b6d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.406138] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71dd501c-8cb2-45ae-966a-59b3d375a433 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.419607] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a615796d-71a3-43bb-a7a0-19738b18f268 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.425337] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-437b34ba-0a53-49d2-9056-4e02436c9498 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.456109] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181659MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1313.456338] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1313.456681] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1313.496780] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Instance bc0157a8-969b-448c-82cf-c773e07d6d02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=59490) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1635}} [ 1313.496982] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 1 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1313.497139] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=640MB phys_disk=200GB used_disk=1GB total_vcpus=48 used_vcpus=1 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1313.524787] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b77b9916-f25c-468e-8c71-be2fde199fa6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.532493] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ea55cbe-5e94-4d0f-8abb-dfe46de60b05 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.561928] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7f65c2d-c719-4ee6-966a-90c4f661d1cb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.569304] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93218a63-e332-455e-9083-d2cc8f527e3b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1313.582159] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1313.590599] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) 
set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1313.605087] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1313.605303] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.149s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1315.604755] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1315.605166] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1315.605166] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 1315.615066] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1315.615206] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 1316.383726] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1317.384793] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1318.384968] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1319.589769] env[59490]: WARNING oslo_vmware.rw_handles [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1319.589769] env[59490]: ERROR oslo_vmware.rw_handles [ 1319.590445] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1319.592082] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1319.592345] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] 
Copying Virtual Disk [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/0d7438d8-9600-456d-b367-167b54b6804b/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1319.592637] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a34e2464-d1fe-4070-89ff-706a7a41f9c7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1319.602990] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1319.602990] env[59490]: value = "task-707493" [ 1319.602990] env[59490]: _type = "Task" [ 1319.602990] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1319.610168] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707493, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1320.113623] env[59490]: DEBUG oslo_vmware.exceptions [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1320.113890] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1320.114401] env[59490]: ERROR nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1320.114401] env[59490]: Faults: ['InvalidArgument'] [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] yield resources [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.driver.spawn(context, instance, image_meta, [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File 
"/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._fetch_image_if_missing(context, vi) [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] image_cache(vi, tmp_image_ds_loc) [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] vm_util.copy_virtual_disk( [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] session._wait_for_task(vmdk_copy_task) [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.wait_for_task(task_ref) [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return evt.wait() [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] result = hub.switch() [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.greenlet.switch() [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.f(*self.args, **self.kw) [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise exceptions.translate_fault(task_info.error) [ 1320.114401] env[59490]: ERROR nova.compute.manager 
[instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Faults: ['InvalidArgument'] [ 1320.114401] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.115457] env[59490]: INFO nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Terminating instance [ 1320.116193] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1320.116389] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1320.116633] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-37b6d094-30df-46b8-b89c-5a931b7fa630 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.118764] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1320.118947] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1320.119654] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37e3ed48-cdf8-4064-9b24-fd8ec2de1e46 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.126249] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1320.126407] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-8714b6d9-2f96-41cf-8969-2255de596cd9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.128459] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1320.128622] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1320.129522] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-27b6b81d-16e3-49e7-a67a-091fdfbf43c2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.133973] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){ [ 1320.133973] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52420849-42f9-bcfc-8686-1fdd06f261f3" [ 1320.133973] env[59490]: _type = "Task" [ 1320.133973] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1320.140779] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52420849-42f9-bcfc-8686-1fdd06f261f3, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1320.194248] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1320.194499] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1320.194666] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleting the datastore file [datastore2] d9c5b959-e509-4d1b-8a0b-de2c58a7626f {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1320.194926] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b62cc00b-25f5-492e-ae4a-8d86360f2991 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.200620] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Waiting for the task: (returnval){ [ 1320.200620] env[59490]: value = "task-707495" [ 1320.200620] env[59490]: _type = "Task" [ 1320.200620] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1320.208021] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707495, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1320.643550] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1320.643931] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] vmware_temp/fec9b980-5a11-4bb5-9081-8f1765fdec09/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1320.644082] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a46e5c6a-8643-44b7-9fe0-33df56e5ac84 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.655012] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] vmware_temp/fec9b980-5a11-4bb5-9081-8f1765fdec09/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1320.655198] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Fetch image to [datastore2] vmware_temp/fec9b980-5a11-4bb5-9081-8f1765fdec09/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1320.655346] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/fec9b980-5a11-4bb5-9081-8f1765fdec09/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1320.656038] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5044d1c4-3888-4325-9fda-d68c5cb0fe78 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.662120] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c84f4563-b853-4d3b-b2f7-58bfd64392a6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.670812] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fe46620-227e-4179-96d6-dcefc7b242ce {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.700759] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7df706f6-d898-42e1-b56d-94d699305af0 {{(pid=59490) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.711956] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fa609818-ef59-4c1b-a75e-79fac2e3f720 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.713748] env[59490]: DEBUG oslo_vmware.api [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Task: {'id': task-707495, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076914} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1320.713939] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1320.714123] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1320.714282] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1320.714445] env[59490]: INFO nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Took 0.60 seconds to destroy the instance on the hypervisor. 
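The "Acquiring lock / acquired / released" triplets around the claim abort below are emitted by oslo.concurrency's lockutils: the resource tracker serializes abort_instance_claim, clean_compute_node_cache and _update_available_resource on the single "compute_resources" semaphore, and the waited/held times in the log (0.000s waits, 0.024s to 0.149s holds) are measured by that wrapper. A minimal sketch of the pattern, with a hypothetical tracker class standing in for nova's ResourceTracker:

    # Sketch: the synchronized pattern behind the compute_resources log lines.
    from oslo_concurrency import lockutils

    class MiniTracker(object):
        """Hypothetical stand-in; nova decorates its real tracker methods."""

        @lockutils.synchronized('compute_resources')
        def abort_instance_claim(self, instance_uuid):
            # While this body runs, every other method synchronized on
            # 'compute_resources' blocks -- that blocked time is the
            # "waited N.NNNs" figure in the log lines.
            print('rolling back usage for %s' % instance_uuid)

    MiniTracker().abort_instance_claim('d9c5b959-e509-4d1b-8a0b-de2c58a7626f')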
[ 1320.716399] env[59490]: DEBUG nova.compute.claims [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1320.716571] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1320.716775] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1320.734044] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1320.740646] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1320.741264] env[59490]: DEBUG nova.compute.utils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1320.742716] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1320.742896] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1320.743066] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1320.743230] env[59490]: DEBUG nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1320.743381] env[59490]: DEBUG nova.network.neutron [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1320.790212] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1320.791027] env[59490]: ERROR nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] result = getattr(controller, method)(*args, **kwargs) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._get(image_id) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] resp, body = self.http_client.get(url, headers=header) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: 
f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.request(url, 'GET', **kwargs) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._handle_response(resp) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exc.from_response(resp, resp.content) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] During handling of the above exception, another exception occurred: [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] yield resources [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.driver.spawn(context, instance, image_meta, [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._fetch_image_if_missing(context, vi) [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image_fetch(context, vi, tmp_image_ds_loc) [ 1320.791027] env[59490]: ERROR 
nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] images.fetch_image( [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1320.791027] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] metadata = IMAGE_API.get(context, image_ref) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return session.show(context, image_id, [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] _reraise_translated_image_exception(image_id) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise new_exc.with_traceback(exc_trace) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] result = getattr(controller, method)(*args, **kwargs) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._get(image_id) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] resp, body = self.http_client.get(url, headers=header) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] 
return self.request(url, 'GET', **kwargs) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._handle_response(resp) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exc.from_response(resp, resp.content) [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1320.792158] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1320.792158] env[59490]: INFO nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Terminating instance [ 1320.792766] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1320.792981] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1320.793520] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1320.793702] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1320.793922] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80139447-6d65-4e8a-bbf1-be52691ef1e7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.796404] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c50c6856-fe79-4a3c-8e42-2047e9e73c81 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.804844] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1320.805055] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-060630e6-2ed1-47ca-97a0-f8e162c03de2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.807354] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1320.807518] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1320.808437] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e9ce4f6f-9455-4bd3-914b-71ebcc0009e4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.812770] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){ [ 1320.812770] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52ffac63-88b5-b257-e56e-1545fbaeaa87" [ 1320.812770] env[59490]: _type = "Task" [ 1320.812770] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1320.819792] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52ffac63-88b5-b257-e56e-1545fbaeaa87, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1320.845088] env[59490]: DEBUG neutronclient.v2_0.client [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1320.846478] env[59490]: ERROR nova.compute.manager [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.driver.spawn(context, instance, image_meta, [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._fetch_image_if_missing(context, vi) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] image_cache(vi, tmp_image_ds_loc) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] vm_util.copy_virtual_disk( [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] session._wait_for_task(vmdk_copy_task) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.wait_for_task(task_ref) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1320.846478] env[59490]: ERROR 
nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return evt.wait() [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] result = hub.switch() [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.greenlet.switch() [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.f(*self.args, **self.kw) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise exceptions.translate_fault(task_info.error) [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Faults: ['InvalidArgument'] [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] During handling of the above exception, another exception occurred: [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._build_and_run_instance(context, instance, image, [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] with excutils.save_and_reraise_exception(): [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.force_reraise() [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1320.846478] env[59490]: ERROR nova.compute.manager [instance: 
d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise self.value [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] with self.rt.instance_claim(context, instance, node, allocs, [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.abort() [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return f(*args, **kwargs) [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._unset_instance_host_and_node(instance) [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] instance.save() [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] updates, result = self.indirection_api.object_action( [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return cctxt.call(context, 'object_action', objinst=objinst, [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] result = self.transport._send( [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self._driver.send(target, ctxt, message, [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File 
"/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise result [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] nova.exception_Remote.InstanceNotFound_Remote: Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f could not be found. [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return getattr(target, method)(*args, **kwargs) [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return fn(self, *args, **kwargs) [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] old_ref, inst_ref = db.instance_update_and_get_original( [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return f(*args, **kwargs) [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1320.847583] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] with excutils.save_and_reraise_exception() as ectxt: [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.force_reraise() [ 1320.849158] env[59490]: ERROR 
nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise self.value [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return f(*args, **kwargs) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return f(context, *args, **kwargs) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise exception.InstanceNotFound(instance_id=uuid) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] nova.exception.InstanceNotFound: Instance d9c5b959-e509-4d1b-8a0b-de2c58a7626f could not be found. 
[ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] During handling of the above exception, another exception occurred: [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] exception_handler_v20(status_code, error_body) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise client_exc(message=error_message, [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Neutron server returns request_ids: ['req-963a565e-d21a-4a0d-8ad3-b339f37aba48'] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] During handling of the above exception, another exception occurred: [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] Traceback (most recent call last): [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._deallocate_network(context, instance, requested_networks) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self.network_api.deallocate_for_instance( [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: 
d9c5b959-e509-4d1b-8a0b-de2c58a7626f] data = neutron.list_ports(**search_opts) [ 1320.849158] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.list('ports', self.ports_path, retrieve_all, [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] for r in self._pagination(collection, path, **params): [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] res = self.get(path, params=params) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.retry_request("GET", action, body=body, [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] return self.do_request(method, action, body=body, [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] ret = obj(*args, **kwargs) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, 
in do_request [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] self._handle_fault_response(status_code, replybody, resp) [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] raise exception.Unauthorized() [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] nova.exception.Unauthorized: Not authorized. [ 1320.850865] env[59490]: ERROR nova.compute.manager [instance: d9c5b959-e509-4d1b-8a0b-de2c58a7626f] [ 1320.866246] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e52b66ea-f70f-4442-82c1-8be92b718632 tempest-MultipleCreateTestJSON-773842354 tempest-MultipleCreateTestJSON-773842354-project-member] Lock "d9c5b959-e509-4d1b-8a0b-de2c58a7626f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.384s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1320.877988] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1320.878194] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1320.878365] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleting the datastore file [datastore2] f4bbfad2-f118-4292-bb36-4229c333dd4c {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1320.878606] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0286ad57-9d54-4db4-82c1-72b151b0adc6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1320.884790] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){ [ 1320.884790] env[59490]: value = "task-707497" [ 1320.884790] env[59490]: _type = "Task" [ 1320.884790] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1320.892456] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707497, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1321.322874] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1321.323106] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] vmware_temp/fea0a790-6aea-4a7f-a165-90259090d4c8/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1321.323332] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ce5c4dd-b6ba-4f78-8e75-a9f2c4f22a3b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.335020] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] vmware_temp/fea0a790-6aea-4a7f-a165-90259090d4c8/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1321.335221] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Fetch image to [datastore2] vmware_temp/fea0a790-6aea-4a7f-a165-90259090d4c8/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1321.335383] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/fea0a790-6aea-4a7f-a165-90259090d4c8/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1321.336118] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9a68ce9-a9aa-40ee-82f5-e93ccce3f80e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.342950] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c989e62d-77c7-4400-948a-f21b35c05eca {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.351885] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05feda1c-4e65-46ea-8c3b-becac35f7b42 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.383786] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cf68164-bdb1-43ab-84d5-e239364398d0 {{(pid=59490) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.394094] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c5759d32-3d0b-4916-a099-35c4cd9466bf {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.395681] env[59490]: DEBUG oslo_vmware.api [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707497, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.06573} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1321.395923] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1321.396105] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1321.396264] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1321.396436] env[59490]: INFO nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Took 0.60 seconds to destroy the instance on the hypervisor. 
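The ImageNotAuthorized traceback above shows Nova's translate-and-reraise idiom from nova/image/glance.py (lines 285/287/1031 in the trace): the raw glanceclient HTTPUnauthorized is caught, mapped to Nova's own ImageNotAuthorized, and re-raised with the original traceback attached, which is what produces the chained "During handling of the above exception, another exception occurred" block in the log. A stripped-down sketch of the idiom, using stand-in exception classes rather than the real nova/glanceclient types:

    import sys

    class HTTPUnauthorized(Exception):
        """Stand-in for glanceclient.exc.HTTPUnauthorized."""

    class ImageNotAuthorized(Exception):
        """Stand-in for nova.exception.ImageNotAuthorized."""
        def __init__(self, image_id):
            super().__init__('Not authorized for image %s.' % image_id)

    def _translate_image_exception(image_id, exc_value):
        # Map the client-level 401 onto Nova's own exception type.
        if isinstance(exc_value, HTTPUnauthorized):
            return ImageNotAuthorized(image_id)
        return exc_value

    def show(image_id):
        try:
            raise HTTPUnauthorized('HTTP 401 Unauthorized')  # the failing GET
        except Exception:
            exc_type, exc_value, exc_trace = sys.exc_info()
            new_exc = _translate_image_exception(image_id, exc_value)
            # Re-raising inside the except block keeps the original traceback
            # and yields the chained two-traceback output seen above.
            raise new_exc.with_traceback(exc_trace)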
[ 1321.398393] env[59490]: DEBUG nova.compute.claims [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1321.398554] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1321.398755] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1321.421727] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1321.424813] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1321.425650] env[59490]: DEBUG nova.compute.utils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance f4bbfad2-f118-4292-bb36-4229c333dd4c could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1321.426756] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1321.426915] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1321.427078] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1321.427241] env[59490]: DEBUG nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1321.427391] env[59490]: DEBUG nova.network.neutron [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1321.451061] env[59490]: DEBUG neutronclient.v2_0.client [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1321.452573] env[59490]: ERROR nova.compute.manager [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] result = getattr(controller, method)(*args, **kwargs) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._get(image_id) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] resp, body = self.http_client.get(url, headers=header) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.request(url, 'GET', **kwargs) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._handle_response(resp) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exc.from_response(resp, resp.content) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] During handling of the above exception, another exception occurred: [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.driver.spawn(context, instance, image_meta, [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._fetch_image_if_missing(context, vi) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image_fetch(context, vi, tmp_image_ds_loc) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] images.fetch_image( [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: 
f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] metadata = IMAGE_API.get(context, image_ref) [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return session.show(context, image_id, [ 1321.452573] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] _reraise_translated_image_exception(image_id) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise new_exc.with_traceback(exc_trace) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] result = getattr(controller, method)(*args, **kwargs) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._get(image_id) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] resp, body = self.http_client.get(url, headers=header) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.request(url, 'GET', **kwargs) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] 
return self._handle_response(resp) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exc.from_response(resp, resp.content) [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] During handling of the above exception, another exception occurred: [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._build_and_run_instance(context, instance, image, [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] with excutils.save_and_reraise_exception(): [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.force_reraise() [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise self.value [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] with self.rt.instance_claim(context, instance, node, allocs, [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.abort() [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1321.453703] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return f(*args, **kwargs) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._unset_instance_host_and_node(instance) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] instance.save() [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] updates, result = self.indirection_api.object_action( [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return cctxt.call(context, 'object_action', objinst=objinst, [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] result = self.transport._send( [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._driver.send(target, ctxt, message, [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise result [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] nova.exception_Remote.InstanceNotFound_Remote: Instance f4bbfad2-f118-4292-bb36-4229c333dd4c could not be found. 
[ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return getattr(target, method)(*args, **kwargs) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return fn(self, *args, **kwargs) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] old_ref, inst_ref = db.instance_update_and_get_original( [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return f(*args, **kwargs) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] with excutils.save_and_reraise_exception() as ectxt: [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.force_reraise() [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise self.value [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return f(*args, 
**kwargs) [ 1321.454969] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return f(context, *args, **kwargs) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exception.InstanceNotFound(instance_id=uuid) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] nova.exception.InstanceNotFound: Instance f4bbfad2-f118-4292-bb36-4229c333dd4c could not be found. [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] During handling of the above exception, another exception occurred: [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] exception_handler_v20(status_code, error_body) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise client_exc(message=error_message, [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1321.456285] 
env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Neutron server returns request_ids: ['req-a4a5edba-7935-47fe-b1c0-73a5809a8c33'] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] During handling of the above exception, another exception occurred: [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] Traceback (most recent call last): [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._deallocate_network(context, instance, requested_networks) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self.network_api.deallocate_for_instance( [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] data = neutron.list_ports(**search_opts) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.list('ports', self.ports_path, retrieve_all, [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1321.456285] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] for r in self._pagination(collection, path, **params): [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] res = self.get(path, params=params) [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.retry_request("GET", action, body=body, [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] return self.do_request(method, action, body=body, [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] ret = obj(*args, **kwargs) [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] self._handle_fault_response(status_code, replybody, resp) [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] raise exception.Unauthorized() [ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] nova.exception.Unauthorized: Not authorized. 
[ 1321.457628] env[59490]: ERROR nova.compute.manager [instance: f4bbfad2-f118-4292-bb36-4229c333dd4c] [ 1321.472221] env[59490]: DEBUG oslo_concurrency.lockutils [None req-371c4e4c-2b91-4ece-ae94-89fb7677a7c8 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "f4bbfad2-f118-4292-bb36-4229c333dd4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 422.595s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1321.517222] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1321.517982] env[59490]: ERROR nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] result = getattr(controller, method)(*args, **kwargs) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._get(image_id) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] resp, body = self.http_client.get(url, headers=header) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.request(url, 'GET', 
**kwargs) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._handle_response(resp) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exc.from_response(resp, resp.content) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] During handling of the above exception, another exception occurred: [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] yield resources [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.driver.spawn(context, instance, image_meta, [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._fetch_image_if_missing(context, vi) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image_fetch(context, vi, tmp_image_ds_loc) [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] 
images.fetch_image( [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1321.517982] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] metadata = IMAGE_API.get(context, image_ref) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return session.show(context, image_id, [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] _reraise_translated_image_exception(image_id) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise new_exc.with_traceback(exc_trace) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] result = getattr(controller, method)(*args, **kwargs) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._get(image_id) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] resp, body = self.http_client.get(url, headers=header) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.request(url, 'GET', **kwargs) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1321.519205] 
env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._handle_response(resp) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exc.from_response(resp, resp.content) [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1321.519205] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1321.519205] env[59490]: INFO nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Terminating instance [ 1321.520351] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1321.520539] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1321.520820] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1321.521013] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1321.521800] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d8fac42-fbc0-4493-8f8e-c8cca6176621 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.524901] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7bc3df23-1293-4404-b3f9-be56d4c5f0ae {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.531176] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1321.531371] env[59490]: DEBUG oslo_vmware.service [-] Invoking 
VirtualMachine.UnregisterVM with opID=oslo.vmware-5e92a100-feab-43a3-a1ff-0804d6a8ab7c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.533431] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1321.533596] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1321.534501] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-41fb2e8b-6b45-429f-8682-efcff6a06515 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.539050] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Waiting for the task: (returnval){ [ 1321.539050] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52d201f8-d951-1fc8-706b-293586cbbbb4" [ 1321.539050] env[59490]: _type = "Task" [ 1321.539050] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1321.546036] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52d201f8-d951-1fc8-706b-293586cbbbb4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1321.596039] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1321.596205] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1321.596376] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleting the datastore file [datastore2] e879cc90-f290-42cd-9059-46f42284a32c {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1321.596647] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-476764a4-6734-4869-8600-174903db8b33 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1321.602414] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){ [ 1321.602414] env[59490]: value = "task-707499" [ 1321.602414] env[59490]: _type = "Task" [ 1321.602414] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1321.609842] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707499, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1322.049229] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1322.049500] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Creating directory with path [datastore2] vmware_temp/7ad1b9d8-fc8e-4773-ab1c-414d50cfe300/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1322.049678] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bc5fc818-8f4f-4558-ac89-085215091297 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.060467] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Created directory with path [datastore2] vmware_temp/7ad1b9d8-fc8e-4773-ab1c-414d50cfe300/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1322.060641] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Fetch image to [datastore2] vmware_temp/7ad1b9d8-fc8e-4773-ab1c-414d50cfe300/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1322.060803] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/7ad1b9d8-fc8e-4773-ab1c-414d50cfe300/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1322.061536] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82d075b6-1039-4a08-922c-98ae163c53b5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.068028] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c337256-511e-4446-af18-780f6ac569cd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.076648] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e5f19ee-8a60-407c-ab17-c4bbf0eaf7b4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.110505] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf6acfdd-5b18-4aa8-9ee4-fc6d6c8c0547 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.117570] env[59490]: DEBUG oslo_vmware.api [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707499, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076746} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1322.119052] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1322.119240] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1322.119405] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1322.119570] env[59490]: INFO nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1322.121366] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-690aa5c0-35da-4148-9e73-a0b8e5830ff6 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.124821] env[59490]: DEBUG nova.compute.claims [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1322.125031] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1322.125264] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1322.145917] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1322.152152] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.027s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1322.152775] env[59490]: DEBUG nova.compute.utils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance e879cc90-f290-42cd-9059-46f42284a32c could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1322.154121] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Instance disappeared during build. 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}} [ 1322.154281] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1322.154433] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1322.154591] env[59490]: DEBUG nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1322.154741] env[59490]: DEBUG nova.network.neutron [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1322.179116] env[59490]: DEBUG neutronclient.v2_0.client [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1322.180575] env[59490]: ERROR nova.compute.manager [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: e879cc90-f290-42cd-9059-46f42284a32c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] result = getattr(controller, method)(*args, **kwargs) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._get(image_id) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] resp, body = self.http_client.get(url, headers=header) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.request(url, 'GET', **kwargs) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._handle_response(resp) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exc.from_response(resp, resp.content) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] During handling of the above exception, another exception occurred: [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.driver.spawn(context, instance, image_meta, [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._fetch_image_if_missing(context, vi) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image_fetch(context, vi, tmp_image_ds_loc) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] images.fetch_image( [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] metadata = IMAGE_API.get(context, image_ref) [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return session.show(context, image_id, [ 1322.180575] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] _reraise_translated_image_exception(image_id) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise new_exc.with_traceback(exc_trace) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: 
e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] result = getattr(controller, method)(*args, **kwargs) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._get(image_id) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] resp, body = self.http_client.get(url, headers=header) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.request(url, 'GET', **kwargs) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._handle_response(resp) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exc.from_response(resp, resp.content) [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. 
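This second leg shows the translation step: glance.py catches the client exception and re-raises Nova's own ImageNotAuthorized while keeping the original traceback, which is what the raise new_exc.with_traceback(exc_trace) frame above does. A self-contained sketch of that pattern, with stand-in exception classes:

import sys

class ImageNotAuthorized(Exception):
    """Stand-in for nova.exception.ImageNotAuthorized (illustrative only)."""

class HTTPUnauthorized(Exception):
    """Stand-in for glanceclient.exc.HTTPUnauthorized (illustrative only)."""

def _reraise_translated_image_exception(image_id):
    # Translate the client exception into the service's own type while
    # preserving the original traceback, as the glance.py:1031 frame shows.
    _exc_type, _exc_value, exc_trace = sys.exc_info()
    new_exc = ImageNotAuthorized(f"Not authorized for image {image_id}.")
    raise new_exc.with_traceback(exc_trace)

def show(image_id):
    try:
        raise HTTPUnauthorized("HTTP 401 Unauthorized")
    except HTTPUnauthorized:
        _reraise_translated_image_exception(image_id)

try:
    show("2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9")
except ImageNotAuthorized as exc:
    print(exc)  # the traceback still points at the original raise site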
[ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] During handling of the above exception, another exception occurred: [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._build_and_run_instance(context, instance, image, [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] with excutils.save_and_reraise_exception(): [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.force_reraise() [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise self.value [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] with self.rt.instance_claim(context, instance, node, allocs, [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.abort() [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1322.181727] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return f(*args, **kwargs) [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._unset_instance_host_and_node(instance) [ 1322.182913] env[59490]: ERROR nova.compute.manager 
[instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] instance.save() [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] updates, result = self.indirection_api.object_action( [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return cctxt.call(context, 'object_action', objinst=objinst, [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] result = self.transport._send( [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._driver.send(target, ctxt, message, [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise result [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] nova.exception_Remote.InstanceNotFound_Remote: Instance e879cc90-f290-42cd-9059-46f42284a32c could not be found. 
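The InstanceNotFound_Remote name indicates the exception was raised on the conductor side and re-raised locally by oslo.messaging, which rebuilds it as a dynamically created subclass carrying the _Remote suffix so callers can still catch the base type. The sketch below is an illustrative simplification of that re-raise pattern, not oslo.messaging's actual implementation:

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound (illustrative only)."""

def reraise_remotely(exc, server_traceback):
    # Build a subclass on the fly; except-clauses for the base class still
    # match, while the name records that the failure happened remotely.
    remote_cls = type(exc.__class__.__name__ + "_Remote",
                      (exc.__class__,), {})
    raise remote_cls(f"{exc}\n{server_traceback}")

try:
    reraise_remotely(
        InstanceNotFound("Instance e879cc90-f290-42cd-9059-46f42284a32c could not be found."),
        "Traceback (most recent call last): ...")
except InstanceNotFound as exc:
    print(type(exc).__name__)  # prints: InstanceNotFound_Remote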
[ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return getattr(target, method)(*args, **kwargs) [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return fn(self, *args, **kwargs) [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] old_ref, inst_ref = db.instance_update_and_get_original( [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return f(*args, **kwargs) [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] with excutils.save_and_reraise_exception() as ectxt: [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.force_reraise() [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise self.value [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return f(*args, 
**kwargs) [ 1322.182913] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return f(context, *args, **kwargs) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exception.InstanceNotFound(instance_id=uuid) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] nova.exception.InstanceNotFound: Instance e879cc90-f290-42cd-9059-46f42284a32c could not be found. [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] During handling of the above exception, another exception occurred: [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] exception_handler_v20(status_code, error_body) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise client_exc(message=error_message, [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1322.184184] 
env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Neutron server returns request_ids: ['req-6dbb7ef8-ede3-424f-90da-cbe682c60fd7'] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] During handling of the above exception, another exception occurred: [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] Traceback (most recent call last): [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._deallocate_network(context, instance, requested_networks) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self.network_api.deallocate_for_instance( [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] data = neutron.list_ports(**search_opts) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.list('ports', self.ports_path, retrieve_all, [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1322.184184] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] for r in self._pagination(collection, path, **params): [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] res = self.get(path, params=params) [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.retry_request("GET", action, body=body, [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] return self.do_request(method, action, body=body, [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] ret = obj(*args, **kwargs) [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] self._handle_fault_response(status_code, replybody, resp) [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] raise exception.Unauthorized() [ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] nova.exception.Unauthorized: Not authorized. 
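The outermost failure is the network cleanup itself: the wrapper in nova/network/neutron.py (the frames at lines 196 and 204 above) converts the neutronclient 401 into nova.exception.Unauthorized so compute code deals with a single exception family. A hedged sketch of that wrapper shape, with stand-in exception classes; the real wrapper also refreshes the token and retries, which is omitted here:

import functools

class NeutronUnauthorized(Exception):
    """Stand-in for neutronclient.common.exceptions.Unauthorized."""

class NovaUnauthorized(Exception):
    """Stand-in for nova.exception.Unauthorized."""

def translate_neutron_auth_errors(func):
    # Catch the neutron client's 401 and re-raise it as the service's own
    # Unauthorized, the shape seen at neutron.py:204 in the traceback.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except NeutronUnauthorized:
            raise NovaUnauthorized("Not authorized.")
    return wrapper

@translate_neutron_auth_errors
def list_ports(**search_opts):
    raise NeutronUnauthorized(
        "401: The request you have made requires authentication.")

try:
    list_ports(device_id="e879cc90-f290-42cd-9059-46f42284a32c")
except NovaUnauthorized as exc:
    print(exc)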
[ 1322.185533] env[59490]: ERROR nova.compute.manager [instance: e879cc90-f290-42cd-9059-46f42284a32c] [ 1322.200852] env[59490]: DEBUG oslo_concurrency.lockutils [None req-b08b4693-8693-4dc2-90a3-8b43f0ed9cd6 tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "e879cc90-f290-42cd-9059-46f42284a32c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 425.462s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1322.239412] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1322.240181] env[59490]: ERROR nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] result = getattr(controller, method)(*args, **kwargs) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._get(image_id) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] resp, body = self.http_client.get(url, headers=header) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.request(url, 'GET', 
**kwargs) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._handle_response(resp) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exc.from_response(resp, resp.content) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] During handling of the above exception, another exception occurred: [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] yield resources [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.driver.spawn(context, instance, image_meta, [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._fetch_image_if_missing(context, vi) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image_fetch(context, vi, tmp_image_ds_loc) [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] 
images.fetch_image( [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1322.240181] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] metadata = IMAGE_API.get(context, image_ref) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return session.show(context, image_id, [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] _reraise_translated_image_exception(image_id) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise new_exc.with_traceback(exc_trace) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] result = getattr(controller, method)(*args, **kwargs) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._get(image_id) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] resp, body = self.http_client.get(url, headers=header) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.request(url, 'GET', **kwargs) [ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.241235] 
env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._handle_response(resp)
[ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
[ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exc.from_response(resp, resp.content)
[ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1322.241235] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414]
[ 1322.241235] env[59490]: INFO nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Terminating instance
[ 1322.241982] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1322.242201] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1322.242785] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1322.242963] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1322.243218] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f694940f-0b63-4fe3-8ac8-de4bb6b31f98 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.246247] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55894a69-398b-48f7-a085-dc187e0e93af {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.253184] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1322.254160] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9d21d445-c16c-40f0-8a54-cd55a129ad89 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.255506] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1322.255670] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1322.256317] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bd2258bc-8354-4a33-ac21-a600f847b40b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.261465] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Waiting for the task: (returnval){
[ 1322.261465] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5289f319-3b65-1886-8038-60f992ff4466"
[ 1322.261465] env[59490]: _type = "Task"
[ 1322.261465] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1322.268329] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5289f319-3b65-1886-8038-60f992ff4466, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1322.325753] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1322.325949] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1322.326142] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Deleting the datastore file [datastore2] f6d58f5a-f432-47a2-af63-033ae4c3d414 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1322.326383] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-011c12c1-df0b-4173-be07-8995c1754159 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.332248] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Waiting for the task: (returnval){
[ 1322.332248] env[59490]: value = "task-707501"
[ 1322.332248] env[59490]: _type = "Task"
[ 1322.332248] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1322.340074] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Task: {'id': task-707501, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1322.383609] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1322.771581] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1322.771826] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Creating directory with path [datastore2] vmware_temp/68ee91ed-5e56-45e1-a371-848a5040f869/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1322.772051] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f7b5e24c-b24b-4ed4-a0bd-dc529f436a7f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.783121] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Created directory with path [datastore2] vmware_temp/68ee91ed-5e56-45e1-a371-848a5040f869/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1322.783301] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Fetch image to [datastore2] vmware_temp/68ee91ed-5e56-45e1-a371-848a5040f869/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1322.783461] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/68ee91ed-5e56-45e1-a371-848a5040f869/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1322.784163] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e26eff58-ba55-470b-a471-567b981340fd {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.790420] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48988949-f07c-45f9-a0a6-e6020d8aedbc {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.798970] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77470ef4-0376-4ffc-ab01-42dff59f1034 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.829861] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5aa2c05-d9e9-4b8f-8f08-027580af07c4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.837369] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a21b11cb-c2bf-4f95-ad61-a40538659609 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1322.841603] env[59490]: DEBUG oslo_vmware.api [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Task: {'id': task-707501, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064662} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1322.842115] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1322.842297] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1322.842461] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1322.842625] env[59490]: INFO nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Took 0.60 seconds to destroy the instance on the hypervisor.
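The SearchDatastore_Task and DeleteDatastoreFile_Task lines above show oslo.vmware's task handling: invoke the vSphere task, then poll it ("progress is 0%.") until it reports success along with a duration ("completed successfully"). A generic, self-contained sketch of that polling loop, using a fake task object in place of a real vim session:

import time

class FakeTask:
    """Stand-in for a vSphere task handle; only the control flow matters."""
    def __init__(self, ticks_until_done=3):
        self._ticks = ticks_until_done
    def poll(self):
        self._ticks -= 1
        done = self._ticks <= 0
        return {"state": "success" if done else "running",
                "progress": 100 if done else 0}

def wait_for_task(task, interval=0.5):
    # Poll the task on a fixed interval until it reaches a terminal state,
    # returning the elapsed time the way the log reports duration_secs.
    start = time.monotonic()
    while True:
        info = task.poll()
        print(f"Task progress is {info['progress']}%.")
        if info["state"] == "success":
            return time.monotonic() - start
        time.sleep(interval)

duration = wait_for_task(FakeTask())
print(f"completed successfully in {duration:.3f}s")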
[ 1322.844702] env[59490]: DEBUG nova.compute.claims [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1322.844865] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1322.845086] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1322.860858] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1322.870015] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.025s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1322.870676] env[59490]: DEBUG nova.compute.utils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1322.872058] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1322.872226] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1322.872385] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1322.872559] env[59490]: DEBUG nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1322.872691] env[59490]: DEBUG nova.network.neutron [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1322.897837] env[59490]: DEBUG neutronclient.v2_0.client [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}} [ 1322.899303] env[59490]: ERROR nova.compute.manager [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] result = getattr(controller, method)(*args, **kwargs) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._get(image_id) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] resp, body = self.http_client.get(url, headers=header) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File 
"/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.request(url, 'GET', **kwargs) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._handle_response(resp) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exc.from_response(resp, resp.content) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] During handling of the above exception, another exception occurred: [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.driver.spawn(context, instance, image_meta, [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._fetch_image_if_missing(context, vi) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image_fetch(context, vi, tmp_image_ds_loc) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] images.fetch_image( [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: 
f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] metadata = IMAGE_API.get(context, image_ref) [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return session.show(context, image_id, [ 1322.899303] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] _reraise_translated_image_exception(image_id) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise new_exc.with_traceback(exc_trace) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] result = getattr(controller, method)(*args, **kwargs) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._get(image_id) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] resp, body = self.http_client.get(url, headers=header) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.request(url, 'GET', **kwargs) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] 
return self._handle_response(resp) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exc.from_response(resp, resp.content) [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] During handling of the above exception, another exception occurred: [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._build_and_run_instance(context, instance, image, [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] with excutils.save_and_reraise_exception(): [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.force_reraise() [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise self.value [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] with self.rt.instance_claim(context, instance, node, allocs, [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.abort() [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File 
"/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1322.900359] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return f(*args, **kwargs) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._unset_instance_host_and_node(instance) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] instance.save() [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] updates, result = self.indirection_api.object_action( [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return cctxt.call(context, 'object_action', objinst=objinst, [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] result = self.transport._send( [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._driver.send(target, ctxt, message, [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise result [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] nova.exception_Remote.InstanceNotFound_Remote: Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 could not be found. 
[ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return getattr(target, method)(*args, **kwargs) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return fn(self, *args, **kwargs) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] old_ref, inst_ref = db.instance_update_and_get_original( [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return f(*args, **kwargs) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] with excutils.save_and_reraise_exception() as ectxt: [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.force_reraise() [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise self.value [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return f(*args, 
**kwargs) [ 1322.901491] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return f(context, *args, **kwargs) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exception.InstanceNotFound(instance_id=uuid) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] nova.exception.InstanceNotFound: Instance f6d58f5a-f432-47a2-af63-033ae4c3d414 could not be found. [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] During handling of the above exception, another exception occurred: [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] exception_handler_v20(status_code, error_body) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise client_exc(message=error_message, [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1322.902758] 
env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Neutron server returns request_ids: ['req-55de8953-a2f0-41ad-958d-e8128f273b32'] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] During handling of the above exception, another exception occurred: [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] Traceback (most recent call last): [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._deallocate_network(context, instance, requested_networks) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self.network_api.deallocate_for_instance( [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] data = neutron.list_ports(**search_opts) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.list('ports', self.ports_path, retrieve_all, [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1322.902758] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] for r in self._pagination(collection, path, **params): [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] res = self.get(path, params=params) [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.retry_request("GET", action, body=body, [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] return self.do_request(method, action, body=body, [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] ret = obj(*args, **kwargs) [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] self._handle_fault_response(status_code, replybody, resp) [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] raise exception.Unauthorized() [ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] nova.exception.Unauthorized: Not authorized. 
[ 1322.903961] env[59490]: ERROR nova.compute.manager [instance: f6d58f5a-f432-47a2-af63-033ae4c3d414] [ 1322.919170] env[59490]: DEBUG oslo_concurrency.lockutils [None req-125e97cf-6bf9-4c75-af7c-26560ec1aa50 tempest-AttachInterfacesTestJSON-650774828 tempest-AttachInterfacesTestJSON-650774828-project-member] Lock "f6d58f5a-f432-47a2-af63-033ae4c3d414" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 425.458s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1322.957301] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1322.957905] env[59490]: ERROR nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Traceback (most recent call last): [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] result = getattr(controller, method)(*args, **kwargs) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self._get(image_id) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] resp, body = self.http_client.get(url, headers=header) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self.request(url, 'GET', 
**kwargs) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self._handle_response(resp) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] raise exc.from_response(resp, resp.content) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] During handling of the above exception, another exception occurred: [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Traceback (most recent call last): [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] yield resources [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] self.driver.spawn(context, instance, image_meta, [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] self._fetch_image_if_missing(context, vi) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] image_fetch(context, vi, tmp_image_ds_loc) [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] 
images.fetch_image( [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1322.957905] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] metadata = IMAGE_API.get(context, image_ref) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return session.show(context, image_id, [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] _reraise_translated_image_exception(image_id) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] raise new_exc.with_traceback(exc_trace) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] result = getattr(controller, method)(*args, **kwargs) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self._get(image_id) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] resp, body = self.http_client.get(url, headers=header) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self.request(url, 'GET', **kwargs) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1322.959105] 
env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] return self._handle_response(resp) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] raise exc.from_response(resp, resp.content) [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1322.959105] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] [ 1322.959105] env[59490]: INFO nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Terminating instance [ 1322.960221] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1322.960221] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1322.960816] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1322.961030] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1322.961260] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e2189f1d-a80e-49fd-90aa-af2c8e52fba7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.964162] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-076d7c17-e831-43e0-8029-66aaec0c8463 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.971113] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1322.971336] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1e273e58-9814-48bd-a383-29348b8dc853 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.973466] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1322.973629] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1322.974596] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-be889f5e-b0ea-4d5e-bfb7-e42e5c8ed6e3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1322.979667] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){ [ 1322.979667] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5295ca30-8929-f9f5-3de9-b68c0260760a" [ 1322.979667] env[59490]: _type = "Task" [ 1322.979667] env[59490]: } to complete. 
{{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1322.994097] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1322.994329] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Creating directory with path [datastore2] vmware_temp/ab65712b-b1f2-448c-93b9-448f80f18a19/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1322.994559] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7bb2e569-b605-4e3e-bfe5-af80699ede22 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.013749] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Created directory with path [datastore2] vmware_temp/ab65712b-b1f2-448c-93b9-448f80f18a19/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1323.013946] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Fetch image to [datastore2] vmware_temp/ab65712b-b1f2-448c-93b9-448f80f18a19/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1323.014123] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/ab65712b-b1f2-448c-93b9-448f80f18a19/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1323.014951] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2b01230-d483-420e-885f-6d6118d22b4a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.021528] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7673320-b31f-4388-9252-cad53a794d01 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.030783] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75214397-ef32-4d23-a1c1-c508e49fdbef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.062913] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-67a194e0-7266-483d-a42f-b6cfecefc169 {{(pid=59490) 
request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.065424] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1323.065606] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1323.065769] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Deleting the datastore file [datastore2] 014bca6d-9df7-4245-90b4-3f291262292a {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1323.065975] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6f9b2fb9-4140-4f9b-a54b-039dd48f09d3 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.070601] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-0cd6aab1-d6e9-4525-94c9-5f9197668a2b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1323.073298] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Waiting for the task: (returnval){ [ 1323.073298] env[59490]: value = "task-707503" [ 1323.073298] env[59490]: _type = "Task" [ 1323.073298] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1323.081828] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Task: {'id': task-707503, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1323.092493] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1323.187070] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1323.187851] env[59490]: ERROR nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] result = getattr(controller, method)(*args, **kwargs) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._get(image_id) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] resp, body = self.http_client.get(url, headers=header) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.request(url, 'GET', **kwargs) [ 1323.187851] env[59490]: ERROR 
nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._handle_response(resp) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise exc.from_response(resp, resp.content) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] During handling of the above exception, another exception occurred: [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] yield resources [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.driver.spawn(context, instance, image_meta, [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._fetch_image_if_missing(context, vi) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] image_fetch(context, vi, tmp_image_ds_loc) [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1323.187851] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] images.fetch_image( [ 1323.187851] env[59490]: 
ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695]   File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
    metadata = IMAGE_API.get(context, image_ref)
  File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
    return session.show(context, image_id,
  File "/opt/stack/nova/nova/image/glance.py", line 287, in show
    _reraise_translated_image_exception(image_id)
  File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
    raise new_exc.with_traceback(exc_trace)
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
[ 1323.188859] env[59490]: INFO nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Terminating instance
[ 1323.189830] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}}
[ 1323.189905] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1323.190735] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}}
[ 1323.190912] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1323.191143] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f8654596-9f3c-496e-8cb1-89018a3fa2f0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.194008] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f47c80e-9e55-40f1-9564-9254ed9db9e9 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.201149] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1323.201364] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-cfd82e76-417b-48aa-bb96-bc18cc53c64f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.203569] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1323.203733] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Folder [datastore2] devstack-image-cache_base created. {{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1323.204682] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-917e189c-cfd3-4add-9f66-be89fe417314 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.209493] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){
[ 1323.209493] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52266408-201f-9bc4-aa31-8156a73f04a5"
[ 1323.209493] env[59490]: _type = "Task"
[ 1323.209493] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1323.216369] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52266408-201f-9bc4-aa31-8156a73f04a5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1323.257716] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1323.257944] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1323.258136] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleting the datastore file [datastore2] d0673be9-d670-4d3f-aefa-26f4e336a695 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1323.258440] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-de72f419-cd46-4e9d-a376-5396c6d728f7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.264297] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Waiting for the task: (returnval){
[ 1323.264297] env[59490]: value = "task-707505"
[ 1323.264297] env[59490]: _type = "Task"
[ 1323.264297] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}}
[ 1323.272721] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707505, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}}
[ 1323.583497] env[59490]: DEBUG oslo_vmware.api [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Task: {'id': task-707503, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068385} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1323.583704] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1323.583882] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1323.584060] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1323.584233] env[59490]: INFO nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Took 0.62 seconds to destroy the instance on the hypervisor.
[ 1323.586318] env[59490]: DEBUG nova.compute.claims [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1323.586477] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1323.586690] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1323.611244] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.024s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1323.611891] env[59490]: DEBUG nova.compute.utils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance 014bca6d-9df7-4245-90b4-3f291262292a could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1323.613195] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1323.613353] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1323.613504] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1323.613664] env[59490]: DEBUG nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1323.613814] env[59490]: DEBUG nova.network.neutron [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1323.707893] env[59490]: DEBUG neutronclient.v2_0.client [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1323.710345] env[59490]: ERROR nova.compute.manager [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1323.710345] env[59490]: ERROR nova.compute.manager [instance: 014bca6d-9df7-4245-90b4-3f291262292a] Traceback (most recent call last):
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance
    self.driver.spawn(context, instance, image_meta,
  File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn
    self._vmops.spawn(context, instance, image_meta, injected_files,
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
    self._fetch_image_if_missing(context, vi)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing
    image_fetch(context, vi, tmp_image_ds_loc)
  File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file
    images.fetch_image(
  File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image
    metadata = IMAGE_API.get(context, image_ref)
  File "/opt/stack/nova/nova/image/glance.py", line 1205, in get
    return session.show(context, image_id,
  File "/opt/stack/nova/nova/image/glance.py", line 287, in show
    _reraise_translated_image_exception(image_id)
  File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
    raise new_exc.with_traceback(exc_trace)
  File "/opt/stack/nova/nova/image/glance.py", line 285, in show
    image = self._client.call(context, 2, 'get', args=(image_id,))
  File "/opt/stack/nova/nova/image/glance.py", line 191, in call
    result = getattr(controller, method)(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get
    return self._get(image_id)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner
    return RequestIdProxy(wrapped(*args, **kwargs))
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get
    resp, body = self.http_client.get(url, headers=header)
  File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get
    return self.request(url, 'GET', **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request
    return self._handle_response(resp)
  File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response
    raise exc.from_response(resp, resp.content)
nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.
During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance
    self._build_and_run_instance(context, instance, image,
  File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance
    with excutils.save_and_reraise_exception():
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()
  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value
  File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance
    with self.rt.instance_claim(context, instance, node, allocs,
  File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__
    self.abort()
  File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort
    self.tracker.abort_instance_claim(self.context, self.instance_ref,
  File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner
    return f(*args, **kwargs)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim
    self._unset_instance_host_and_node(instance)
  File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node
    instance.save()
  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper
    updates, result = self.indirection_api.object_action(
  File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action
    return cctxt.call(context, 'object_action', objinst=objinst,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call
    result = self.transport._send(
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send
    return self._driver.send(target, ctxt, message,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
    return self._send(target, ctxt, message, wait_for_reply, timeout,
  File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send
    raise result
nova.exception_Remote.InstanceNotFound_Remote: Instance 014bca6d-9df7-4245-90b4-3f291262292a could not be found.
Traceback (most recent call last):

  File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch
    return getattr(target, method)(*args, **kwargs)

  File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper
    return fn(self, *args, **kwargs)

  File "/opt/stack/nova/nova/objects/instance.py", line 838, in save
    old_ref, inst_ref = db.instance_update_and_get_original(

  File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper
    return f(*args, **kwargs)

  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper
    with excutils.save_and_reraise_exception() as ectxt:

  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
    self.force_reraise()

  File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
    raise self.value

  File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper
    return f(*args, **kwargs)

  File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper
    return f(context, *args, **kwargs)

  File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original
    instance_ref = _instance_get_by_uuid(context, instance_uuid,

  File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid
    raise exception.InstanceNotFound(instance_id=uuid)

nova.exception.InstanceNotFound: Instance 014bca6d-9df7-4245-90b4-3f291262292a could not be found.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
    exception_handler_v20(status_code, error_body)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
    raise client_exc(message=error_message,
neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
Neutron server returns request_ids: ['req-51ca7924-ae43-4ddd-b0b6-99d13f68e460']

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks
    self._deallocate_network(context, instance, requested_networks)
  File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network
    self.network_api.deallocate_for_instance(
  File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
    data = neutron.list_ports(**search_opts)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports
    return self.list('ports', self.ports_path, retrieve_all,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list
    for r in self._pagination(collection, path, **params):
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination
    res = self.get(path, params=params)
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get
    return self.retry_request("GET", action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request
    return self.do_request(method, action, body=body,
  File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
    ret = obj(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request
    self._handle_fault_response(status_code, replybody, resp)
  File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper
    raise exception.Unauthorized()
nova.exception.Unauthorized: Not authorized.
[ 1323.722396] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1323.722627] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating directory with path [datastore2] vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1323.724650] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-e5be57bd-0a24-49ef-8298-94a73cd81994 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.732509] env[59490]: DEBUG oslo_concurrency.lockutils [None req-d249e4f9-4bbb-4f66-ba50-e70a8677d611 tempest-AttachVolumeShelveTestJSON-603390732 tempest-AttachVolumeShelveTestJSON-603390732-project-member] Lock "014bca6d-9df7-4245-90b4-3f291262292a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 424.146s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1323.738456] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Created directory with path [datastore2] vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1323.738627] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Fetch image to [datastore2] vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1323.738794] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1323.739521] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a021aec5-1634-43fd-9cd2-028bdd497ace {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.745914] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e4ed065-5009-46b3-add3-ea445167a35e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.754367] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0706fce-841a-413d-b893-bfcd0d13fa6f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.786704] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecd4b468-f602-4bc8-90fb-063b43c47fa4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.793150] env[59490]: DEBUG oslo_vmware.api [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Task: {'id': task-707505, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065379} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}}
[ 1323.794485] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1323.794664] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1323.794826] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1323.794984] env[59490]: INFO nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1323.796844] env[59490]: DEBUG nova.compute.claims [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1323.796997] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1323.797214] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1323.799527] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6366b540-ea57-40b1-9206-440af0c57c36 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1323.819175] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1323.823528] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.026s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1323.824168] env[59490]: DEBUG nova.compute.utils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance d0673be9-d670-4d3f-aefa-26f4e336a695 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1323.825709] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1323.825866] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1323.826030] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}}
[ 1323.826189] env[59490]: DEBUG nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1323.826340] env[59490]: DEBUG nova.network.neutron [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1323.905153] env[59490]: DEBUG oslo_vmware.rw_handles [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}}
[ 1323.954849] env[59490]: DEBUG neutronclient.v2_0.client [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=59490) _handle_fault_response /usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py:262}}
[ 1323.956314] env[59490]: ERROR nova.compute.manager [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized.
[ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] result = getattr(controller, method)(*args, **kwargs) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._get(image_id) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] resp, body = self.http_client.get(url, headers=header) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.request(url, 'GET', **kwargs) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._handle_response(resp) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise exc.from_response(resp, resp.content) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] During handling of the above exception, another exception occurred: [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.driver.spawn(context, instance, image_meta, [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._fetch_image_if_missing(context, vi) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] image_fetch(context, vi, tmp_image_ds_loc) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] images.fetch_image( [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] metadata = IMAGE_API.get(context, image_ref) [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return session.show(context, image_id, [ 1323.956314] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] _reraise_translated_image_exception(image_id) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise new_exc.with_traceback(exc_trace) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: 
d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] result = getattr(controller, method)(*args, **kwargs) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 197, in get [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._get(image_id) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/utils.py", line 649, in inner [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/v2/images.py", line 190, in _get [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] resp, body = self.http_client.get(url, headers=header) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/keystoneauth1/adapter.py", line 395, in get [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.request(url, 'GET', **kwargs) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 380, in request [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._handle_response(resp) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise exc.from_response(resp, resp.content) [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] nova.exception.ImageNotAuthorized: Not authorized for image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9. 
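At this point the glanceclient HTTPUnauthorized has been translated into nova.exception.ImageNotAuthorized by _reraise_translated_image_exception, which re-raises the new exception with the original traceback attached; that is why both the glanceclient frames and the Nova frames appear in one dump. A pure-Python sketch of that pattern (the exception class and the name-based mapping here are illustrative, not Nova's actual translation table):

    import sys

    class ImageNotAuthorized(Exception):
        def __init__(self, image_id):
            super().__init__('Not authorized for image %s.' % image_id)

    def call_translated(image_id, func, *args):
        try:
            return func(*args)
        except Exception:
            exc_type, exc_value, exc_trace = sys.exc_info()
            if exc_type.__name__ == 'HTTPUnauthorized':  # crude mapping, sketch only
                # with_traceback keeps the inner frames from the client library
                raise ImageNotAuthorized(image_id).with_traceback(exc_trace)
            raise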
[ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] During handling of the above exception, another exception occurred: [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2426, in _do_build_and_run_instance [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._build_and_run_instance(context, instance, image, [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2621, in _build_and_run_instance [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] with excutils.save_and_reraise_exception(): [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.force_reraise() [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise self.value [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2585, in _build_and_run_instance [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] with self.rt.instance_claim(context, instance, node, allocs, [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/claims.py", line 43, in __exit__ [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.abort() [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/claims.py", line 86, in abort [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.tracker.abort_instance_claim(self.context, self.instance_ref, [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1323.957711] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return f(*args, **kwargs) [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 548, in abort_instance_claim [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._unset_instance_host_and_node(instance) [ 1323.958901] env[59490]: ERROR nova.compute.manager 
[instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/resource_tracker.py", line 539, in _unset_instance_host_and_node [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] instance.save() [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 209, in wrapper [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] updates, result = self.indirection_api.object_action( [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/conductor/rpcapi.py", line 247, in object_action [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return cctxt.call(context, 'object_action', objinst=objinst, [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/client.py", line 190, in call [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] result = self.transport._send( [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/transport.py", line 123, in _send [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._driver.send(target, ctxt, message, [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self._send(target, ctxt, message, wait_for_reply, timeout, [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 681, in _send [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise result [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] nova.exception_Remote.InstanceNotFound_Remote: Instance d0673be9-d670-4d3f-aefa-26f4e336a695 could not be found. 
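The jump from ImageNotAuthorized to InstanceNotFound happens inside oslo.utils' save_and_reraise_exception, visible in the frames above: the build error is held while the claim is aborted, and when the abort path itself fails (instance.save() over RPC to the conductor found no row), the cleanup error propagates instead of the saved one. A minimal usage sketch of that context manager, with placeholder build/cleanup stand-ins (the remote conductor-side traceback continues below):

    from oslo_utils import excutils

    def build_with_cleanup(cleanup):
        try:
            raise RuntimeError('spawn failed')   # stands in for the build error
        except RuntimeError:
            with excutils.save_and_reraise_exception():
                # cleanup runs before the saved exception is re-raised;
                # if cleanup itself raises, its exception replaces the
                # saved one -- the pattern that produced this log's chain
                cleanup()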
[ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/conductor/manager.py", line 142, in _object_dispatch [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return getattr(target, method)(*args, **kwargs) [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_versionedobjects/base.py", line 226, in wrapper [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return fn(self, *args, **kwargs) [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/objects/instance.py", line 838, in save [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] old_ref, inst_ref = db.instance_update_and_get_original( [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/db/utils.py", line 35, in wrapper [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return f(*args, **kwargs) [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 144, in wrapper [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] with excutils.save_and_reraise_exception() as ectxt: [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.force_reraise() [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise self.value [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/oslo_db/api.py", line 142, in wrapper [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return f(*args, 
**kwargs) [ 1323.958901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/db/main/api.py", line 207, in wrapper [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return f(context, *args, **kwargs) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/db/main/api.py", line 2283, in instance_update_and_get_original [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] instance_ref = _instance_get_by_uuid(context, instance_uuid, [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/db/main/api.py", line 1405, in _instance_get_by_uuid [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise exception.InstanceNotFound(instance_id=uuid) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] nova.exception.InstanceNotFound: Instance d0673be9-d670-4d3f-aefa-26f4e336a695 could not be found. [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] During handling of the above exception, another exception occurred: [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] exception_handler_v20(status_code, error_body) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise client_exc(message=error_message, [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1323.960901] 
env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Neutron server returns request_ids: ['req-4815da00-19f5-4ed9-830c-1d6d9021162e'] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] During handling of the above exception, another exception occurred: [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] Traceback (most recent call last): [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 3015, in _cleanup_allocated_networks [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._deallocate_network(context, instance, requested_networks) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/compute/manager.py", line 2261, in _deallocate_network [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self.network_api.deallocate_for_instance( [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] data = neutron.list_ports(**search_opts) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.list('ports', self.ports_path, retrieve_all, [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 372, in list [ 1323.960901] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] for r in self._pagination(collection, path, **params): [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] res = self.get(path, params=params) [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 
1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 356, in get [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.retry_request("GET", action, body=body, [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] return self.do_request(method, action, body=body, [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] ret = obj(*args, **kwargs) [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/usr/local/lib/python3.10/dist-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] self._handle_fault_response(status_code, replybody, resp) [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] raise exception.Unauthorized() [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] nova.exception.Unauthorized: Not authorized. [ 1323.962294] env[59490]: ERROR nova.compute.manager [instance: d0673be9-d670-4d3f-aefa-26f4e336a695] [ 1323.962294] env[59490]: DEBUG oslo_vmware.rw_handles [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1323.962294] env[59490]: DEBUG oslo_vmware.rw_handles [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. 
{{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1323.980641] env[59490]: DEBUG oslo_concurrency.lockutils [None req-6b0ff0d0-7db2-46d3-9a88-68f5373f331a tempest-ListServerFiltersTestJSON-1231950571 tempest-ListServerFiltersTestJSON-1231950571-project-member] Lock "d0673be9-d670-4d3f-aefa-26f4e336a695" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 424.049s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1367.707960] env[59490]: DEBUG nova.compute.manager [req-ec783dca-b28f-45e4-8cd6-4d7ea5a983bd req-5e25978c-e51b-4c19-bf53-239731e848bf service nova] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Received event network-vif-deleted-d1c4d20b-00d5-4c04-aaf1-0d4a8d304b99 {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1370.517070] env[59490]: WARNING oslo_vmware.rw_handles [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1370.517070] env[59490]: ERROR oslo_vmware.rw_handles [ 1370.517070] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1370.517070] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1370.517070] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Copying Virtual Disk [datastore2] 
vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/fb47fdce-0d0b-46b8-8eb5-d97be738227c/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1370.518187] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-2fc9b1af-f9de-44b9-97ed-e49c414b49e5 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1370.525923] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 1370.525923] env[59490]: value = "task-707506" [ 1370.525923] env[59490]: _type = "Task" [ 1370.525923] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1370.534534] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707506, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1371.038787] env[59490]: DEBUG oslo_vmware.exceptions [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1371.039889] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1371.039889] env[59490]: ERROR nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1371.039889] env[59490]: Faults: ['InvalidArgument'] [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Traceback (most recent call last): [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] yield resources [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] self.driver.spawn(context, instance, image_meta, [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 
1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] self._fetch_image_if_missing(context, vi) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] image_cache(vi, tmp_image_ds_loc) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] vm_util.copy_virtual_disk( [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] session._wait_for_task(vmdk_copy_task) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] return self.wait_for_task(task_ref) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] return evt.wait() [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] result = hub.switch() [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] return self.greenlet.switch() [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] self.f(*self.args, **self.kw) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] raise exceptions.translate_fault(task_info.error) [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] 
oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Faults: ['InvalidArgument'] [ 1371.039889] env[59490]: ERROR nova.compute.manager [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] [ 1371.041062] env[59490]: INFO nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Terminating instance [ 1371.044993] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Start destroying the instance on the hypervisor. {{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1371.045195] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1371.045951] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39287242-20cd-4179-89a0-b2f00b06b81e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1371.053329] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1371.053554] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-316f2048-cf65-4f88-bac6-b0c37bbeb70e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1371.111680] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1371.111882] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1371.112067] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleting the datastore file [datastore2] bc0157a8-969b-448c-82cf-c773e07d6d02 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1371.112322] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-ba458d3e-7365-41b2-8d54-7206f4d83bec {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 
1371.119622] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Waiting for the task: (returnval){ [ 1371.119622] env[59490]: value = "task-707508" [ 1371.119622] env[59490]: _type = "Task" [ 1371.119622] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1371.127646] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707508, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1371.629192] env[59490]: DEBUG oslo_vmware.api [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Task: {'id': task-707508, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069392} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1371.629655] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1371.629817] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1371.630040] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1371.630263] env[59490]: INFO nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Took 0.59 seconds to destroy the instance on the hypervisor. 
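Task records like task-707508 above come from oslo.vmware's wait_for_task, which re-reads the vCenter task on an interval until it reports success or error, logging progress between polls. A generic sketch of that poll loop, assuming a read_task_info callable that returns a state dict; this is not oslo.vmware's actual implementation:

    import time

    def wait_for_task(read_task_info, interval=0.5, timeout=300):
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            info = read_task_info()            # e.g. one property read per poll
            if info['state'] == 'success':
                return info
            if info['state'] == 'error':
                raise RuntimeError(info.get('error', 'task failed'))
            time.sleep(interval)               # progress is logged between polls
        raise TimeoutError('task did not complete in %ss' % timeout)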
[ 1371.632709] env[59490]: DEBUG nova.compute.claims [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1371.632932] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1371.633226] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1371.661528] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.028s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1371.662300] env[59490]: DEBUG nova.compute.utils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance bc0157a8-969b-448c-82cf-c773e07d6d02 could not be found. {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1371.664327] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance disappeared during build. {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2483}}
[ 1371.664488] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}}
[ 1371.664641] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2999}} [ 1371.664801] env[59490]: DEBUG nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1371.664954] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1371.699944] env[59490]: DEBUG nova.network.neutron [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1371.711934] env[59490]: INFO nova.compute.manager [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Took 0.05 seconds to deallocate network for instance. [ 1371.762854] env[59490]: DEBUG oslo_concurrency.lockutils [None req-cb6c856f-0962-44ad-8d6b-ea181e5e6c23 tempest-DeleteServersTestJSON-437756230 tempest-DeleteServersTestJSON-437756230-project-member] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 199.872s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1371.763080] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 191.151s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1371.763258] env[59490]: INFO nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] During sync_power_state the instance has a pending task (spawning). Skip. 
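The acquire/waited/held/released bookkeeping around "compute_resources" and the instance-UUID locks in these records is oslo.concurrency's lockutils; the waited and held durations are measured around the decorated critical section. A minimal usage sketch (the function name is illustrative):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def abort_claim():
        # Runs with the named lock held. A concurrent caller blocks at entry,
        # and its blocking time is what the log reports as "waited N.NNNs";
        # the body's runtime is the "held N.NNNs" figure.
        pass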
[ 1371.763420] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "bc0157a8-969b-448c-82cf-c773e07d6d02" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1372.379187] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1373.385053] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1373.385053] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}} [ 1373.385053] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1373.397877] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1373.398108] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1373.398264] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1373.398413] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}} [ 1373.399648] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cca404da-353c-44cb-84eb-a58555244720 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.408693] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11d989b2-326e-4cc4-abe4-16acb8cae670 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.422379] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1889926-6e56-4ad7-b98f-c5013cde5d25 {{(pid=59490) request_handler 
/usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.428639] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c6c490d-2566-4a50-940f-b422e9fcd7a2 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.457914] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181674MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}} [ 1373.458063] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1373.458240] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1373.488335] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}} [ 1373.488595] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}} [ 1373.502205] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e1abbf7-9c90-4587-8d47-0af452a93d93 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.509634] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-181fbbb4-855d-4034-b64f-34e494749daa {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.538431] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccacfc2d-ecef-41e3-bf79-1acba77d341b {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.545169] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-482e8ae1-ada8-4b10-8e14-989954aa5fe0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1373.558108] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1373.565734] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 
715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1373.578011] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}} [ 1373.578189] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1374.578511] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1375.584990] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1375.584990] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1375.595466] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Starting instance... 
{{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2402}} [ 1375.641636] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1375.641860] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1375.643335] env[59490]: INFO nova.compute.claims [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1375.712548] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c3c238c-b5ee-4490-87fa-6a7c13648f8d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1375.720632] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ae1cc9b-7bce-4a80-a7bf-86ffe4254836 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1375.749509] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96fbb411-66b0-4739-9bc4-85193380ecef {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1375.756599] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76d43765-3826-467f-88ed-e6700babcf15 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1375.769396] env[59490]: DEBUG nova.compute.provider_tree [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1375.779424] env[59490]: DEBUG nova.scheduler.client.report [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1375.791660] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 
tempest-AttachVolumeTestJSON-388450277-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.150s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1375.792141] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Start building networks asynchronously for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2799}} [ 1375.828642] env[59490]: DEBUG nova.compute.utils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Using /dev/sd instead of None {{(pid=59490) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1375.829892] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Allocating IP information in the background. {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1375.830062] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] allocate_for_instance() {{(pid=59490) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1375.838627] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Start building block device mappings for instance. {{(pid=59490) _build_resources /opt/stack/nova/nova/compute/manager.py:2834}} [ 1375.882648] env[59490]: DEBUG nova.policy [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4853156cd0fa4d9fb8608ea3e37b1c72', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb07fa15c61045949b56549f7d5b41e6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=59490) authorize /opt/stack/nova/nova/policy.py:203}} [ 1375.899935] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Start spawning the instance on the hypervisor. 
{{(pid=59490) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2608}} [ 1375.919937] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Getting desirable topologies for flavor Flavor(created_at=2024-09-07T10:18:50Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2024-09-07T10:18:32Z,direct_url=,disk_format='vmdk',id=2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='cd409d4c0a16442ab4835869a1e0850d',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2024-09-07T10:18:33Z,virtual_size=,visibility=), allow threads: False {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1375.920186] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Flavor limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1375.920335] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Image limits 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1375.920512] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Flavor pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1375.920644] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Image pref 0:0:0 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1375.920849] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=59490) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1375.921100] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1375.921260] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1375.921420] env[59490]: DEBUG nova.virt.hardware [None 
req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Got 1 possible topologies {{(pid=59490) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1375.921574] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1375.921739] env[59490]: DEBUG nova.virt.hardware [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=59490) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1375.922811] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6ab6c3a-1962-4a19-826b-2b806da62812 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1375.931540] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-183056b7-28c3-4f6d-94de-7dfb9018fdda {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1376.167365] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Successfully created port: 3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1376.384633] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1376.384852] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}} [ 1376.385046] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}} [ 1376.394745] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Skipping network cache update for instance because it is Building. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9825}} [ 1376.394881] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. 
{{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}} [ 1376.632059] env[59490]: DEBUG nova.compute.manager [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Received event network-vif-plugged-3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1376.632359] env[59490]: DEBUG oslo_concurrency.lockutils [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] Acquiring lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1376.632479] env[59490]: DEBUG oslo_concurrency.lockutils [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] Lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1376.632661] env[59490]: DEBUG oslo_concurrency.lockutils [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] Lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1376.632852] env[59490]: DEBUG nova.compute.manager [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] No waiting events found dispatching network-vif-plugged-3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1376.633025] env[59490]: WARNING nova.compute.manager [req-121c75f8-c5f0-41c8-9003-b67cd7a2747f req-3ad6db36-aae1-4b11-86ea-f9a3bb1489c0 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Received unexpected event network-vif-plugged-3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb for instance with vm_state building and task_state spawning. 
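
The inventory payload that recurs throughout this run determines how much capacity Placement will schedule against provider 715aacdb-6e76-47b7-ae6f-492abc122a20. As a minimal sketch (plain Python, not Nova code), the schedulable amount per resource class is (total - reserved) * allocation_ratio, which for the values logged above works out to 192 VCPU, 196078 MEMORY_MB and 400 DISK_GB:

    # Capacity arithmetic implied by the logged inventory; the values are
    # copied from the set_inventory_for_provider entries above.
    INVENTORY = {
        'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
    }

    def usable_capacity(inv):
        """Schedulable amount per resource class: (total - reserved) * ratio."""
        return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
                for rc, v in inv.items()}

    print(usable_capacity(INVENTORY))
    # {'VCPU': 192.0, 'MEMORY_MB': 196078.0, 'DISK_GB': 400.0}
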
[ 1376.705699] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Successfully updated port: 3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1376.716725] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1376.716873] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquired lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1376.717025] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Building network info cache for instance {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1376.750051] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Instance cache missing network info. {{(pid=59490) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1376.895523] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Updating instance_info_cache with network_info: [{"id": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "address": "fa:16:3e:0b:f0:e9", "network": {"id": "78937322-7521-4b8e-ad50-d0323426b232", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1579187499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb07fa15c61045949b56549f7d5b41e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3844ddf9-c1", "ovs_interfaceid": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1376.907340] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 
tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Releasing lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1376.907616] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Instance network_info: |[{"id": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "address": "fa:16:3e:0b:f0:e9", "network": {"id": "78937322-7521-4b8e-ad50-d0323426b232", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1579187499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb07fa15c61045949b56549f7d5b41e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3844ddf9-c1", "ovs_interfaceid": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=59490) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1967}} [ 1376.907961] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0b:f0:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '04ccbc7a-cf8d-4ea2-8411-291a1e27df7b', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb', 'vif_model': 'vmxnet3'}] {{(pid=59490) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1376.915494] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Creating folder: Project (bb07fa15c61045949b56549f7d5b41e6). Parent ref: group-v168905. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1376.915958] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-36f69fb7-97e6-4764-800a-043ee396c13c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1376.926874] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Created folder: Project (bb07fa15c61045949b56549f7d5b41e6) in parent group-v168905. 
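
The two Folder.CreateFolder invocations above (the per-project folder, then its Instances child) are plain SOAP calls issued through the oslo.vmware session. A hedged sketch of the same pattern against oslo.vmware's public API follows; the connection parameters are placeholders, not values from this deployment:

    from oslo_vmware import api
    from oslo_vmware import exceptions as vexc

    # Placeholder credentials; the real session is created once at driver start.
    session = api.VMwareAPISession('vcenter.example.com', 'user', 'secret',
                                   api_retry_count=2, task_poll_interval=0.5)

    parent = session.vim.service_content.rootFolder  # parent Folder moref
    try:
        # One SOAP round trip per folder, as in the log entries above.
        folder = session.invoke_api(session.vim, 'CreateFolder',
                                    parent, name='Instances')
    except vexc.DuplicateName:
        pass  # an existing folder of the same name is treated as success
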
[ 1376.926998] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Creating folder: Instances. Parent ref: group-v168982. {{(pid=59490) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1376.927142] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7d0c64b8-ecfc-40ec-95e1-b1fe1ee54d61 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1376.936408] env[59490]: INFO nova.virt.vmwareapi.vm_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Created folder: Instances in parent group-v168982. [ 1376.936661] env[59490]: DEBUG oslo.service.loopingcall [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=59490) func /usr/local/lib/python3.10/dist-packages/oslo_service/loopingcall.py:435}} [ 1376.936867] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Creating VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1376.937078] env[59490]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ea2448af-bb70-42ad-82d4-4f2c390d6e0c {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1376.955165] env[59490]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1376.955165] env[59490]: value = "task-707511" [ 1376.955165] env[59490]: _type = "Task" [ 1376.955165] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1376.962016] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707511, 'name': CreateVM_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1377.384287] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1377.464866] env[59490]: DEBUG oslo_vmware.api [-] Task: {'id': task-707511, 'name': CreateVM_Task, 'duration_secs': 0.309784} completed successfully. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1377.465160] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Created VM on the ESX host {{(pid=59490) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1377.472631] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1377.472631] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1377.472778] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquired external semaphore "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:323}} [ 1377.472993] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f1af262d-cdb8-4fbc-96fa-408c28f4eba4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1377.478061] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Waiting for the task: (returnval){ [ 1377.478061] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]5208fb4c-1326-bc63-baf1-67c789b8ae21" [ 1377.478061] env[59490]: _type = "Task" [ 1377.478061] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1377.486508] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]5208fb4c-1326-bc63-baf1-67c789b8ae21, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1377.988696] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1377.989100] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Processing image 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1377.989186] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1377.989293] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquired lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1377.989466] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Creating directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1377.989691] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ad5bd57-4fe9-43b0-90c4-baa981d37470 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1377.996762] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Created directory with path [datastore2] devstack-image-cache_base {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1377.996923] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Folder [datastore2] devstack-image-cache_base created. 
{{(pid=59490) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1377.997593] env[59490]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d4c0a08e-5428-411a-a6ff-0d950b9acb3e {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.002247] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Waiting for the task: (returnval){ [ 1378.002247] env[59490]: value = "session[5210d47d-495b-8849-c195-d8b439f95142]52b6b184-6363-7938-baae-841ed57d1cb6" [ 1378.002247] env[59490]: _type = "Task" [ 1378.002247] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1378.009302] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Task: {'id': session[5210d47d-495b-8849-c195-d8b439f95142]52b6b184-6363-7938-baae-841ed57d1cb6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1378.512226] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Preparing fetch location {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1378.512434] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Creating directory with path [datastore2] vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1378.512659] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-900011b6-0c40-41e0-a499-5fa938b36474 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.532760] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Created directory with path [datastore2] vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 {{(pid=59490) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1378.532929] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Fetch image to [datastore2] vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1378.533101] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to [datastore2] 
vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1378.533805] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec1ee9b-6d0b-4932-bdb0-4069633d0d7f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.540547] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0ef3211-9ad7-4bc7-92f8-4e3726c53f04 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.550792] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14b5738b-3f2b-40ea-bf3f-bcb1111dbe9f {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.579710] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78bfd25b-8c2b-4b53-a43f-5697ae80f859 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.584838] env[59490]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-659b652f-1070-4a44-9612-15edc3ff5338 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1378.604157] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Downloading image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1378.647101] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) _create_write_connection /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:122}} [ 1378.699791] env[59490]: DEBUG nova.compute.manager [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Received event network-changed-3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11002}} [ 1378.699978] env[59490]: DEBUG nova.compute.manager [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Refreshing instance network info cache due to event network-changed-3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb. 
{{(pid=59490) external_instance_event /opt/stack/nova/nova/compute/manager.py:11007}} [ 1378.700221] env[59490]: DEBUG oslo_concurrency.lockutils [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] Acquiring lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:312}} [ 1378.700380] env[59490]: DEBUG oslo_concurrency.lockutils [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] Acquired lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:315}} [ 1378.701109] env[59490]: DEBUG nova.network.neutron [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Refreshing network info cache for port 3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb {{(pid=59490) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1378.703641] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Completed reading data from the image iterator. {{(pid=59490) read /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:765}} [ 1378.703815] env[59490]: DEBUG oslo_vmware.rw_handles [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore2. {{(pid=59490) close /usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py:281}} [ 1378.964992] env[59490]: DEBUG nova.network.neutron [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Updated VIF entry in instance network info cache for port 3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb. 
{{(pid=59490) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1378.964992] env[59490]: DEBUG nova.network.neutron [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Updating instance_info_cache with network_info: [{"id": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "address": "fa:16:3e:0b:f0:e9", "network": {"id": "78937322-7521-4b8e-ad50-d0323426b232", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1579187499-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "bb07fa15c61045949b56549f7d5b41e6", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "04ccbc7a-cf8d-4ea2-8411-291a1e27df7b", "external-id": "nsx-vlan-transportzone-998", "segmentation_id": 998, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3844ddf9-c1", "ovs_interfaceid": "3844ddf9-c1d7-4ef5-9828-26a78e7fcdfb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1378.973538] env[59490]: DEBUG oslo_concurrency.lockutils [req-2c883910-171a-498d-99ba-aa8046cbc231 req-391597fc-208a-4c52-9e95-96385c5685a1 service nova] Releasing lock "refresh_cache-e096d5cd-a645-4361-b3f0-7ba4b80a6f52" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1379.383593] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1379.383961] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1383.383849] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1384.379381] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}} [ 1425.839714] env[59490]: WARNING oslo_vmware.rw_handles [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1425.839714] env[59490]: ERROR 
oslo_vmware.rw_handles File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles response.begin() [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1425.839714] env[59490]: ERROR oslo_vmware.rw_handles [ 1425.840481] env[59490]: DEBUG nova.virt.vmwareapi.images [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Downloaded image file data 2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9 to vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk on the data store datastore2 {{(pid=59490) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1425.841852] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Caching image {{(pid=59490) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1425.842144] env[59490]: DEBUG nova.virt.vmwareapi.vm_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Copying Virtual Disk [datastore2] vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/tmp-sparse.vmdk to [datastore2] vmware_temp/55c981b3-65c2-4940-9eba-44fd7f711bf2/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk {{(pid=59490) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1425.842446] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-f2d83898-cc6d-417b-b20c-bfd2a165509d {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1425.850816] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Waiting for the task: (returnval){ [ 1425.850816] env[59490]: value = "task-707512" [ 1425.850816] env[59490]: _type = "Task" [ 1425.850816] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1425.859109] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Task: {'id': task-707512, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1426.360905] env[59490]: DEBUG oslo_vmware.exceptions [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Fault InvalidArgument not matched. {{(pid=59490) get_fault_class /usr/local/lib/python3.10/dist-packages/oslo_vmware/exceptions.py:290}} [ 1426.362781] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Releasing lock "[datastore2] devstack-image-cache_base/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9/2ec2b44f-128f-4fbb-8ee4-370acf2ac4a9.vmdk" {{(pid=59490) lock /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:333}} [ 1426.362781] env[59490]: ERROR nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1426.362781] env[59490]: Faults: ['InvalidArgument'] [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Traceback (most recent call last): [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/compute/manager.py", line 2864, in _build_resources [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] yield resources [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self.driver.spawn(context, instance, image_meta, [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self._fetch_image_if_missing(context, vi) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] image_cache(vi, tmp_image_ds_loc) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] vm_util.copy_virtual_disk( [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 
1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] session._wait_for_task(vmdk_copy_task) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return self.wait_for_task(task_ref) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return evt.wait() [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", line 125, in wait [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] result = hub.switch() [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return self.greenlet.switch() [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self.f(*self.args, **self.kw) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] raise exceptions.translate_fault(task_info.error) [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Faults: ['InvalidArgument'] [ 1426.362781] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] [ 1426.362781] env[59490]: INFO nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Terminating instance [ 1426.365589] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Start destroying the instance on the hypervisor. 
{{(pid=59490) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3120}} [ 1426.365777] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Destroying instance {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1426.366535] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf54f00f-3e40-4edb-9fcc-7186edc14527 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1426.373246] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Unregistering the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1426.373444] env[59490]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-15508f19-27e1-4981-b224-c5c6afa616eb {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1426.433670] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Unregistered the VM {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1426.433883] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Deleting contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1426.434072] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Deleting the datastore file [datastore2] e096d5cd-a645-4361-b3f0-7ba4b80a6f52 {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1426.434334] env[59490]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-65af8ea1-c9ff-4df8-8776-dc9700109172 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1426.441195] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Waiting for the task: (returnval){ [ 1426.441195] env[59490]: value = "task-707514" [ 1426.441195] env[59490]: _type = "Task" [ 1426.441195] env[59490]: } to complete. {{(pid=59490) wait_for_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:397}} [ 1426.448717] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Task: {'id': task-707514, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:434}} [ 1426.950560] env[59490]: DEBUG oslo_vmware.api [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Task: {'id': task-707514, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.068798} completed successfully. {{(pid=59490) _poll_task /usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py:444}} [ 1426.950906] env[59490]: DEBUG nova.virt.vmwareapi.ds_util [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Deleted the datastore file {{(pid=59490) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1426.950954] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Deleted contents of the VM from datastore datastore2 {{(pid=59490) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1426.951148] env[59490]: DEBUG nova.virt.vmwareapi.vmops [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Instance destroyed {{(pid=59490) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1426.951316] env[59490]: INFO nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Took 0.59 seconds to destroy the instance on the hypervisor. 
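
Everything from CreateVM_Task through the failed CopyVirtualDisk_Task above follows one pattern: start a long-running vSphere task, then poll it with wait_for_task, which raises once TaskInfo reports an error state (the "_poll_task ... raise exceptions.translate_fault" frames in the traceback). A hedged sketch of that pattern with hypothetical argument names, not Nova's actual helper:

    from oslo_vmware import exceptions as vexc

    def copy_virtual_disk(session, disk_mgr, dc_ref, src_path, dst_path):
        task = session.invoke_api(session.vim, 'CopyVirtualDisk_Task',
                                  disk_mgr,
                                  sourceName=src_path, sourceDatacenter=dc_ref,
                                  destName=dst_path, destDatacenter=dc_ref)
        try:
            # Polls TaskInfo until success or error; on error the vSphere
            # fault is translated into a Python exception.
            return session.wait_for_task(task)
        except vexc.VimFaultException as exc:
            # fault_list carries the fault names, e.g. ['InvalidArgument'],
            # matching "Faults: ['InvalidArgument']" in the log above.
            print('task failed:', exc.fault_list, exc)
            raise
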
[ 1426.953301] env[59490]: DEBUG nova.compute.claims [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Aborting claim: {{(pid=59490) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1426.953460] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}} [ 1426.953665] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}} [ 1427.014385] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70a1baf8-e46b-413b-96a3-c58449a933e7 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1427.021619] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e6f26e8-07f4-4df9-afbf-cdde017e6146 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1427.052055] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-234b0514-6de4-445d-8cf8-5008adae2147 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1427.058832] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5054ddb-36e7-4d98-a929-75df6bc0be0a {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}} [ 1427.071639] env[59490]: DEBUG nova.compute.provider_tree [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1427.079501] env[59490]: DEBUG nova.scheduler.client.report [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1427.091579] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Lock "compute_resources" 
"released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.138s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}} [ 1427.092087] env[59490]: ERROR nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1427.092087] env[59490]: Faults: ['InvalidArgument'] [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Traceback (most recent call last): [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/compute/manager.py", line 2611, in _build_and_run_instance [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self.driver.spawn(context, instance, image_meta, [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 529, in spawn [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self._fetch_image_if_missing(context, vi) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] image_cache(vi, tmp_image_ds_loc) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] vm_util.copy_virtual_disk( [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] session._wait_for_task(vmdk_copy_task) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return self.wait_for_task(task_ref) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return evt.wait() [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/eventlet/event.py", 
line 125, in wait [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] result = hub.switch() [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/eventlet/hubs/hub.py", line 313, in switch [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] return self.greenlet.switch() [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] self.f(*self.args, **self.kw) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] File "/usr/local/lib/python3.10/dist-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] raise exceptions.translate_fault(task_info.error) [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Faults: ['InvalidArgument'] [ 1427.092087] env[59490]: ERROR nova.compute.manager [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] [ 1427.092980] env[59490]: DEBUG nova.compute.utils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] VimFaultException {{(pid=59490) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1427.094106] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Build of instance e096d5cd-a645-4361-b3f0-7ba4b80a6f52 was re-scheduled: A specified parameter was not correct: fileType [ 1427.094106] env[59490]: Faults: ['InvalidArgument'] {{(pid=59490) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2450}} [ 1427.094465] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Unplugging VIFs for instance {{(pid=59490) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2976}} [ 1427.094627] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
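The traceback above ends with oslo.vmware's task poller raising the translated vCenter fault as a VimFaultException, which nova catches to abort the claim and reschedule. A minimal sketch of that failure path, assuming placeholder host, credentials, and task handle (none of these come from this log):

```python
# Hedged sketch, not Nova's code: how a failed vCenter task surfaces
# through oslo.vmware as seen in the traceback above.
from oslo_vmware import api, exceptions

session = api.VMwareAPISession(
    host='vcenter.example.test',      # placeholder, not the CI vCenter
    server_username='user',           # placeholder credentials
    server_password='secret',
    api_retry_count=3,
    task_poll_interval=0.5,
    create_session=False)             # skip the login round-trip in a sketch

def wait_for_copy(vmdk_copy_task):
    """vmdk_copy_task is a hypothetical CopyVirtualDisk_Task reference."""
    try:
        # wait_for_task() polls the task server-side; when task_info.error
        # is set it raises the translated fault (api.py, _poll_task).
        return session.wait_for_task(vmdk_copy_task)
    except exceptions.VimFaultException as e:
        # For the failure above: e.fault_list == ['InvalidArgument'] and the
        # message is "A specified parameter was not correct: fileType".
        print('task failed with faults:', e.fault_list)
        raise
```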
[ 1427.094788] env[59490]: DEBUG nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Deallocating network for instance {{(pid=59490) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
[ 1427.094943] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] deallocate_for_instance() {{(pid=59490) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1427.346128] env[59490]: DEBUG nova.network.neutron [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Updating instance_info_cache with network_info: [] {{(pid=59490) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1427.358194] env[59490]: INFO nova.compute.manager [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] [instance: e096d5cd-a645-4361-b3f0-7ba4b80a6f52] Took 0.26 seconds to deallocate network for instance.
[ 1427.443134] env[59490]: INFO nova.scheduler.client.report [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Deleted allocations for instance e096d5cd-a645-4361-b3f0-7ba4b80a6f52
[ 1427.460396] env[59490]: DEBUG oslo_concurrency.lockutils [None req-5a16a01a-1066-46f6-9aec-f8dfd5c65530 tempest-AttachVolumeTestJSON-388450277 tempest-AttachVolumeTestJSON-388450277-project-member] Lock "e096d5cd-a645-4361-b3f0-7ba4b80a6f52" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 51.876s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1432.391154] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1433.384953] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1433.385226] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=59490) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10431}}
[ 1434.384527] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager.update_available_resource {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1434.394549] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1434.394757] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1434.394915] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
[ 1434.395153] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=59490) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:861}}
[ 1434.396183] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e86b4bfc-80e3-4432-aa05-eebbd0d66946 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.404924] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da3d08a9-46ca-40fe-820a-de1670c20af1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.418301] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c57046cc-6d77-4dc4-bfd3-f903158ac1f0 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.424285] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-297c4fef-eb0a-4d7f-980d-da1113124fee {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.453132] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=181667MB free_disk=80GB free_vcpus=48 pci_devices=None {{(pid=59490) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1034}}
[ 1434.453295] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:404}}
[ 1434.453440] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:409}}
[ 1434.482264] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1057}}
[ 1434.482430] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=59490) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1066}}
[ 1434.496133] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c363e8a0-6b8f-42f2-ad32-9282448f48ea {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.502884] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf72d506-3ccb-49e4-aa75-d52d85ae8770 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.531685] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e689326c-bdb8-4c13-9cf4-74c60b5cf3c1 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.538714] env[59490]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c230a85-0e13-46c6-a65a-c4eda68bded4 {{(pid=59490) request_handler /usr/local/lib/python3.10/dist-packages/oslo_vmware/service.py:371}}
[ 1434.551569] env[59490]: DEBUG nova.compute.provider_tree [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed in ProviderTree for provider: 715aacdb-6e76-47b7-ae6f-492abc122a20 {{(pid=59490) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1434.559996] env[59490]: DEBUG nova.scheduler.client.report [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Inventory has not changed for provider 715aacdb-6e76-47b7-ae6f-492abc122a20 based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=59490) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1434.573761] env[59490]: DEBUG nova.compute.resource_tracker [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=59490) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:995}}
[ 1434.573924] env[59490]: DEBUG oslo_concurrency.lockutils [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s {{(pid=59490) inner /usr/local/lib/python3.10/dist-packages/oslo_concurrency/lockutils.py:423}}
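The "Inventory has not changed" records above repeat the provider's inventory dict. As an illustration of how those fields determine schedulable capacity (Placement treats usable capacity per resource class as (total - reserved) * allocation_ratio; the helper below is ours, not Nova's):

```python
# Illustrative helper, not Nova code: capacity arithmetic for the inventory
# record logged above for provider 715aacdb-6e76-47b7-ae6f-492abc122a20.
inventory = {
    'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16,
             'step_size': 1, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1,
                  'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 80,
                'step_size': 1, 'allocation_ratio': 1.0},
}

def capacity(resource_class):
    inv = inventory[resource_class]
    return (inv['total'] - inv['reserved']) * inv['allocation_ratio']

for rc in inventory:
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0
    print(rc, capacity(rc))
```

This is why a 48-vCPU node can report 192 schedulable VCPUs: the 4.0 allocation ratio overcommits CPU, while memory and disk are not overcommitted.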
[ 1436.573590] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1438.385521] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1438.385888] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Starting heal instance info cache {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9812}}
[ 1438.385888] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Rebuilding the list of instances to heal {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9816}}
[ 1438.394320] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Didn't find any instances for network info cache update. {{(pid=59490) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9898}}
[ 1438.394499] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1439.384352] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1439.384633] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances with incomplete migration {{(pid=59490) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11137}}
[ 1439.391250] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1440.396957] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1441.384639] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1441.384945] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
[ 1441.385102] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Cleaning up deleted instances {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11099}}
[ 1441.398623] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] There are 1 instances to clean {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11108}}
[ 1441.398920] env[59490]: DEBUG nova.compute.manager [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] [instance: bc0157a8-969b-448c-82cf-c773e07d6d02] Instance has had 0 of 5 cleanup attempts {{(pid=59490) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11112}}
[ 1445.417630] env[59490]: DEBUG oslo_service.periodic_task [None req-e8166bad-b609-4a80-a649-a0403a5064a1 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=59490) run_periodic_tasks /usr/local/lib/python3.10/dist-packages/oslo_service/periodic_task.py:210}}
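The recurring "Running periodic task ComputeManager._*" records are emitted by oslo.service's periodic-task runner as it dispatches each decorated manager method. A minimal sketch of how such tasks are declared and run; the manager class and spacings here are invented for illustration, not Nova's real configuration:

```python
# Hedged sketch of the oslo.service machinery behind the records above.
from oslo_config import cfg
from oslo_service import periodic_task

CONF = cfg.CONF

class DemoManager(periodic_task.PeriodicTasks):
    """Invented stand-in for nova's ComputeManager."""

    @periodic_task.periodic_task(spacing=60)
    def _poll_volume_usage(self, context):
        # Placeholder body; Nova's version gathers usage from the virt driver.
        pass

    @periodic_task.periodic_task(spacing=10)
    def _check_instance_build_time(self, context):
        pass

manager = DemoManager(CONF)
# A service loop calls this repeatedly; each due task runs and is logged
# as "Running periodic task <Class>.<method>" at DEBUG, as seen above.
manager.run_periodic_tasks(context=None)
```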